2. Making student assessment an integral part of student learning
Abstract
Bulgaria’s Pre-school and School Education Act (2016) introduced a new competency-based curriculum, new learning standards and more formative approaches to assessing students, including the use of start-of-year diagnostic tests and qualitative marking. However, while these policies have the potential to enhance learning, changes in school and classroom assessment practices have been slow to take effect, and the country’s high-stakes selection and examinations culture continues to reinforce the perception of student assessment as a primarily summative exercise. This chapter recommends tangible steps Bulgaria can take to use assessment as a means to improve teaching and learning.
Introduction
Student assessment can be a key enabler of student learning by helping teachers, students and parents determine what learners know and what they are capable of doing. This information may also help educators identify specific learning needs before they develop into more serious obstacles, as well as support students in making informed decisions about their educational trajectories. Meaningful assessment practices are especially important in light of the COVID-19 pandemic, facilitating the adaptation of instruction where learning has been disrupted. In Bulgaria, student assessment policies have undergone several changes in the last six years as part of broader reform efforts introduced by the Pre-school and School Education Act (2016) to modernise schooling. In particular, the new competency-based curriculum provides a foundation for achieving national goals and improving the educational outcomes of all students, supporting them and the country to advance.
However, the intended impact of Bulgaria’s education reforms has not yet come to fruition and there is a notable implementation gap at the school and classroom levels. Schools and teachers will need to make assessment a central part of the learning process in order to better detect and address learning issues, redress inequalities related to background or location and promote the complex competencies needed for success in school and beyond. The role of standardised assessment in Bulgaria also needs to be reviewed: national assessments and examinations are not currently contributing to improvements in the quality of education in the classroom or to the choices students make about their pathways. These factors exist within a highly competitive and traditional schooling environment whereby assessment is primarily viewed as a way to sort students into prestigious schools. Overcoming these challenges and establishing a more inclusive and competency-based approach to education will likely require further structural changes to schooling in Bulgaria. In the meantime, there are tangible steps the country can take to use assessment as a means to improve teaching and learning practices and outcomes.
Student assessment in Bulgaria
Bulgaria’s competency-based curriculum introduced important changes to student assessment policy, such as start-of-year diagnostic tests, qualitative marking and expected learning outcomes for each subject and grade level. While these policies have the potential to enhance the quality of education, tangible changes in school and classroom practices have been slow to take effect. Teacher assessments continue to focus on traditional summative tests with a narrow emphasis on a limited range of tasks as opposed to broader, deeper learning. This encourages an educational approach that risks undermining student agency, engagement and progress. The ability of teachers to adopt new assessment practices is constrained by political and public expectations of how students in Bulgaria should be assessed and successful achievement demonstrated. This is, in part, a cultural legacy of education under the Soviet bloc, which was characterised by centralisation, control and a focus on memorisation combined with a culture of competition and performance in contests and examinations. This is not unique to Bulgaria: several other education systems in the region have confronted or are confronting similar challenges (Li et al., 2019[1]; Kitchen et al., 2017[2]; OECD, 2019[3]).
Bulgaria’s national assessment and examination practices, which centre on a high-stakes sorting and examinations culture, have further entrenched these more traditional attitudes. The wider assessment ecosystem is not conducive to implementing the intended changes of Bulgaria’s competency-based curriculum. To move forward, the country needs to create the conditions for teachers and students to take the lead on assessment practices that enable learning. This requires shifting the focus from summative, high-stakes assessment to emphasising formative practices and an improvement-led assessment culture from the earliest years. Supporting teachers in developing these pedagogical skills while simultaneously changing public attitudes towards assessment – and education more broadly – will be key to building buy‑in for the new competency-based curriculum among educators and society at large. This chapter will discuss the different types of student assessment practices currently found in Bulgaria (Table 2.1) and identify elements that could support the country in making this shift. Some details are covered in more depth in Chapter 5, which looks further into how Bulgaria’s national assessments can support system-level monitoring and help advance national education goals.
Table 2.1. Overview of student assessment in Bulgaria
| Reference standards | Types of assessment | Body responsible | Process | Guideline documents | Frequency | Primary use |
|---|---|---|---|---|---|---|
| National Curriculum Framework | Classroom assessment | Teachers | School-readiness assessment at the end of pre-primary education | State Educational Standard (SES) for the Evaluation of the Results of Student Learning (2016) | Once | Certification of readiness for transition to primary education |
| | | | Start-of-year readiness/diagnostic assessment | | Once a year | Assessing gaps in learning |
| | | | Continuous assessment (current and term assessment) | | Up to four times per term (subject-dependent) | Monitoring student progress during a school year |
| | | | End-of-year/end-of-phase examination | | Once a year | Completion of grade level/phase |
| | National assessment | The Center for Assessment | Census-based National External Assessments (NEAs) (Grades 4, 7 and 10) | SES for the Evaluation of the Results of Student Learning (2016) | Three in total; each takes place annually | Monitoring system performance; selection mechanism for upper secondary school (Grade 7) |
| | National examination | The Center for Assessment | State Matriculation examination (Grade 12) | SES for the Evaluation of the Results of Student Learning (2016) | Once | Diploma of completion of upper secondary education; application to tertiary education |
| | | | State Matriculation examination for acquiring a professional qualification (Grade 12) | Vocational Education and Training Act (latest amendments: 2018) | Once | Certification of acquisition of a vocational qualification |
| International Association for the Evaluation of Educational Achievement (IEA) Standards | International assessment | International Association for the Evaluation of Educational Achievement (IEA) | Progress in International Reading Literacy Study (PIRLS) (Grade 4) | | Every five years | Measurement of system performance |
| | | | Trends in International Mathematics and Science Study (TIMSS) (Grade 4) | | Every four years | Measurement of system performance |
| Programme for International Student Assessment (PISA) Standards | International assessment | OECD | PISA (15-year-olds) | | Every three years | Measurement of system performance |
Source: Ministry of Education and Science (2020[4]), OECD Review of Evaluation and Assessment: Country Background Report for Bulgaria, Ministry of Education and Science of Bulgaria, Sofia.
Overall objectives and policy framework
High-performing education systems successfully align curriculum expectations, subject and performance criteria and desired learning outcomes (Darling-Hammond and Wentworth, 2010[5]). National learning goals and expected outcomes, as expressed through qualifications frameworks, curricula and learning standards, help establish an education culture within which assessment supports learning. Bulgaria’s reforms under the Pre-school and School Education Act signalled a clear effort to establish a coherent, learner- and learning-focused policy framework. However, more than five years on, the changes to teaching and learning envisaged by the reform have not yet materialised in classrooms, nor has the desired effect on student outcomes. Considerable gaps between the intended curriculum, the taught curriculum and the assessed curriculum persist and further implementation and alignment efforts are required.
The new curriculum aligns with international frameworks and continues to be updated
Bulgaria’s move towards a competency-based curriculum aims to modernise teaching and learning, in line with international trends, emphasising the mastery and practical application of knowledge and skills, as well as reorienting the teacher’s role from a source of information to that of a mentor or learning partner (Government of Bulgaria, 2020[6]). The Pre-school and School Education Act established nine interdependent and transversal competencies to be embedded across school education for both general and vocational education and training (VET) programmes. These competencies reflect the European Parliament and Council of Europe’s Recommendation on Key Competences for Lifelong Learning (2006, updated 2018), with the addition of sustainable development, healthy lifestyles and sports competency. Efforts were made to take into account and align with international competency frameworks to ensure Bulgarian students have opportunities to study and work abroad after school and that their skills are competitive internationally.
The introduction of the new curriculum, overseen by the Directorate for the Content of Pre-school and School Education within the Ministry of Education and Science (hereafter, the Ministry), has been gradual. The 2021/22 academic year marks the first time that all students in Bulgarian schools are following the new curriculum. Further curriculum updates are planned with the goal of developing more flexible and modular VET programmes and updating general curricula to better promote the key competencies (Government of Bulgaria, 2020[6]). Bulgaria’s National Recovery and Resilience Plan1 also commits to better promoting science, technology, engineering and mathematics (STEM) skills as well as further developing core cognitive skills (Government of Bulgaria, 2020[7]). While these are important developments, it is essential that further reform efforts do not distract from the implementation and consolidation of previously updated curricula. Change takes time and a sustained focus on core priorities is important for impact and for avoiding curricular reform fatigue among teachers, trainers and school leaders, which can make improvements even harder to achieve. Specifically, Bulgaria will need to prioritise classroom-level curricular implementation for younger students and in priority competencies, such as language literacy and mathematical and scientific competency, to ensure that all students are supported in developing the core attitudes and skills that provide the foundations for future learning.
Multiple instructional documents aim to guide the organisation of teaching and learning but can lack clarity and coherence
As part of broader education reforms, Bulgaria introduced a range of new policy documentation relating to the organisation and content of teaching and learning (Table 2.2). The State Educational Standard (SES) for General Education sets out expected learning outcomes by the end of each education phase in every subject. The Framework Curricula, included in the SES for Curriculum, set out organisational aspects for different types of education (i.e. by school or programme type and delivery mode) at each education phase and for each subject. Grade-level subject syllabi are intended to guide teachers’ classroom planning. For the first time, these documents provide expected learning outcomes related to subject competency as well as suggested activities that teachers can do to support the development of these competencies, the share of time dedicated to assessing students and the different modes of assessment to be employed (e.g. continuous assessment, examination, homework, projects, etc.). They also identify links between subject competencies and the nine transversal competencies. While these are all important and necessary resources and many appear of good quality in and of themselves, there is a lack of clarity among teachers as to the role of each one and a lack of coherence among the documents themselves.
Table 2.2. Policy documentation to support schools and teachers in organising and planning learning
| Policy document | Date | Purpose | Content |
|---|---|---|---|
| Pre-school and School Education Act | 2015 | To establish the overall aims and objectives of pre-school and school education. | |
| SES for General Education | 2015 | To determine the goals, content and characteristics of general education at the school level. | |
| SES for Curriculum | 2015 | To set out the characteristics, content and organisational structure of the curriculum. | |
| Subject syllabus | 2016-21 | To establish the requirements and expected learning outcomes for every subject at every grade. | |
| School curriculum | Annual | To determine the organisation of school curricula. | |
| Individual curriculum | Annual | To determine curricula for students with specific needs. | |
Teachers struggle to navigate curriculum documents and apply changes to their classroom practice
Teachers implementing Bulgaria’s new curriculum have been provided with more curricular information than ever before. However, a sense of confusion about the role of the various documents prevails, as well as a perception that the curriculum is overloaded (Ministry of Education and Science, 2019[8]). Interviews undertaken by the OECD review team also indicate that, rather than using the syllabi as intended, teachers continue to rely heavily on textbooks for their planning, teaching and assessment of learning. While the Ministry perceives that the expected learning outcomes act as learning standards, teachers do not consistently apply them in the classroom to support student assessment and there is little monitoring or accountability to incentivise them to do so (Ministry of Education and Science, 2020[4]).
This misapplication may be partly due to ambiguity in the content of the outcomes. Although most are defined as expected results and some are process- or skill-focused, others better describe teaching activities or specific content knowledge (Dimitrova and Lazarov, 2020[9]). For example, in the Grade 6 history and civilisations syllabus, students are expected to “determine causes and consequences of historical events, and research and select information via the Internet” (process/skill-focused), but also “know the most significant conflicts of the period and describe historical figures” (content knowledge) (Ministry of Education and Science, 2016[10]). Furthermore, the suggested teaching and assessment approaches can be very generic and are often repeated across grades and subjects. Teachers of Grade 12 mathematics, for instance, are told that assessment can take the form of an oral examination, written test, classwork or practical work (Ministry of Education and Science, 2020[11]). This adds no value to the information included in higher-level documents.
Each of these challenges – perceived overload, use of textbooks over syllabi and low application of the expected learning outcomes – suggests a considerable gap between the uses of the curricular documentation as intended by the Ministry and the real-life application as carried out by schools and teachers. The Ministry has tried to address these challenges, for example by publishing informative brochures and running a set of regional workshops in 2019, yet the disparities between the intended and implemented curriculum continue to impede the overall success of Bulgaria’s curricular reforms.
A new national evaluation and assessment framework provides detailed instructions regarding the organisation and administration of assessments
Complementing various curricular documents, Bulgaria also introduced a new student assessment framework in 2016, the SES for the Evaluation of the Results of Student Learning (Ordinance 11). This Ordinance aims to align student assessment practices with a competency-based approach, namely by encouraging a greater focus on diagnosing and monitoring student progress across the school year. Specifically, the framework establishes the main types (normative, criterion and mixed) and forms (diagnosis, prognosis, certification, information, motivation, selection) of assessment, as well as how to organise classroom- and school-level assessment, National External Assessments (NEAs), State Matriculation examinations and the certification of learning across education phases (Ministry of Education and Science, 2016[12]).
Ordinance 11 introduces some important changes to Bulgaria’s more traditional student assessment approaches, including the use of qualitative marking and diagnostic assessments in classrooms. However, it also remains focused on the organisational elements of different assessments, such as detailed requirements for timing, frequency and administration. Despite the fact that a move to a competency-based curriculum requires changes in the pedagogical approach to assessment, Ordinance 11 offers minimal information or guidance to support teachers to make such changes.
Implementing competency-based assessment remains a challenge
The introduction of a competency-based curriculum poses a challenge to student assessment practices in any education system because competencies are difficult to assess: they combine knowledge, skills and attitudes and are underpinned by dimensions that are hard to capture but are learned simultaneously (EC, 2010[13]). There are some specificities in Bulgaria’s education system that may have made this shift towards more multi-dimensional assessment even harder to achieve. First, school-level assessments are currently constrained by multiple intermittent, often high-stakes, traditional assessments of student learning, as prescribed in Ordinance 11. These approaches create a negative backwash effect on the curriculum, as “teaching to the test” narrows the focus of learning in the classroom (OECD, 2013[14]). Moreover, Bulgaria’s extensive and frequent changes to curricular documentation may be reducing the space, or at least perceived space, for teachers to understand and engage in more innovative assessment practices. For example, during interviews conducted by the OECD review team, teachers implementing project-based learning in primary education expressed concerns about replicating these approaches in Grades 5-7 when classroom assessment carries consequences for students’ progression and preparations for the high-stakes external assessment in Grade 7 begin.
At the same time, Bulgarian teachers face a highly traditional educational culture among the wider public that emphasises high-stakes assessment and quantitative marking. Both system and institutional actors reported to the OECD team that they have tried to reduce reliance on traditional assessments and increase more competency-based approaches (e.g. projects or case studies) but such efforts often lead to complaints from parents. This context may influence teachers to avoid changing instruction altogether or to implement changes while maintaining traditional types of assessment, meaning more classroom time dedicated to administering assessments as opposed to acting upon the results to enhance learning.
Professional capacity in Bulgaria is another obstacle. When introducing a competency-based approach, systems need to develop the expertise and technical capacity of teachers to design, develop, deliver and evaluate more complex assessments (Nusche et al., 2014[15]). This requires training for teachers but also for other actors in the system such as, in Bulgaria’s case, those working in national assessment agencies or those based in the regional departments of education (REDs) that offer methodological support to schools. However, training for the new curriculum in Bulgaria has been limited to teaching professionals only and has been knowledge-focused as opposed to pedagogy-focused, meaning that assessment practices may have been neglected. Although some specific assessment-focused training is available to teachers in Bulgaria, it is rare and focuses on preparing students for national or international examinations and assessments. Even where teachers create their own assessments, these appear to measure the acquisition of knowledge rather than competencies.
Classroom assessment
Ongoing and regular identification and interpretation of evidence about student learning is a key component of effective instruction (Black and Wiliam, 2018[16]). In Bulgaria, however, alongside the over-reliance on traditional formats, classroom assessment is often viewed by teachers and students – and society – as a validation exercise rather than an integrated part of the learning process.
Teachers in Bulgaria must administer frequent classroom assessments
The purpose of classroom assessment in Bulgaria, as defined in Ordinance 11, is to establish students’ educational outcomes and determine their progress. To this end, teachers are expected to undertake frequent classroom assessments during the academic year (Table 2.3). The school year begins with a diagnostic assessment for all students to ascertain entrance levels of performance and identify areas for support. Following this, regular assessments must take place for all students to determine current marks. The frequency is dependent on the number of subject teaching hours per week and can amount to four assessments per academic term for core subjects such as mathematics and Bulgarian language and literature. These assessments can be oral, written or practical and administered individually or in groups.
Table 2.3. Different types of classroom assessment administered to students in Bulgaria
| Type of assessment | Purpose | Scope | Timing | Format |
|---|---|---|---|---|
| Diagnostic assessment | To establish entrance level and assimilation of key concepts from the previous year, identifying deficits and measures to overcome them | All students | Within three weeks of the start of the school year | Written test |
| Continuous assessment | To establish the student’s current mark and to support the achievement of the expected learning outcomes | All students | Between two and four times an academic term | Oral, written and practical tests; individual or group, depending on scope |
| Equivalency examinations | To support the transition of an upper secondary student from one class or school to another | Students transitioning from one school or pathway to another | | Written test |
| Corrective examinations | To give students who receive a “poor” mark (2) an opportunity to improve their annual grade | Students who receive a 2/“poor” mark in end-of-year assessments | Annually from Grade 5; between two weeks after the end of the school year and two weeks before the start of the next one | |
| Resit examinations | To give students the opportunity to improve their end-of-stage assessment | Students who want to improve their end-of-year assessments | End of phase (Grades 7, 9 and 12) | Three subjects maximum; no resits except in Grade 12 |
Source: Adapted from Ministry of Education and Science (2016[12]), Наредба No. 11 от 1 Септември 2016 г. за Оценяване на Резултатите от Обучението на Учениците [Ordinance No.11 of 01 September 2016 for the Evaluation of the Results of Student Learning], https://www.lex.bg/en/laws/ldoc/2136905302 (accessed on 18 August 2021).
Bulgarian teachers use qualitative and quantitative descriptors when assessing students
When conducting classroom assessments in Bulgaria, teachers of students from Grades 1-12 must assign a qualitative descriptor (excellent, very good, good, intermediate or poor). Ordinance 11 provides generic descriptions for these. For example, an “excellent” should be awarded only to students who “achieve all the expected results from the curriculum, and master and independently apply all new concepts”. For students in Grades 4-12 only, this qualitative descriptor must be paired with a numerical mark (“excellent” equates to a mark between 5.50 and 6.00; “poor” equates to a mark between 2.00 and 2.99). For continuous assessments, teachers must report results to students within two weeks of administering the test and enter them into the relevant school information system.
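The relationship between numerical marks and qualitative descriptors can be illustrated as a simple conversion rule. The sketch below is illustrative only: the “excellent” (5.50-6.00) and “poor” (2.00-2.99) ranges are those cited above from Ordinance 11, while the intermediate cut-offs are assumptions made purely for demonstration.

```python
# Illustrative mark-to-descriptor conversion for Grades 4-12.
# Only the "excellent" (5.50-6.00) and "poor" (2.00-2.99) ranges are taken from
# Ordinance 11 as cited in this chapter; the intermediate boundaries below are
# assumed for demonstration purposes only.

def qualitative_descriptor(mark: float) -> str:
    """Map a numerical mark on the 2.00-6.00 scale to a qualitative descriptor."""
    if not 2.00 <= mark <= 6.00:
        raise ValueError("Marks are expected to fall between 2.00 and 6.00")
    if mark >= 5.50:
        return "excellent"
    if mark >= 4.50:   # assumed boundary
        return "very good"
    if mark >= 3.50:   # assumed boundary
        return "good"
    if mark >= 3.00:   # assumed boundary
        return "intermediate"
    return "poor"      # 2.00-2.99, per Ordinance 11


print(qualitative_descriptor(5.75))  # -> "excellent"
print(qualitative_descriptor(2.40))  # -> "poor"
```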
Bulgaria’s introduction of qualitative descriptors is positive and could support students and teachers to better contextualise numerical marks within the learning process. Moreover, the exemption of younger students from receiving numerical marks is in line with practice in other countries in the region, although numerical marks are introduced earlier (e.g. Grade 2 in Serbia) or later (e.g. Grade 5 in Georgia and Romania), depending on the system. However, without student-, subject- and task-specific clarification, Bulgaria’s qualitative descriptors cannot direct students on how to improve. Teachers are not required to formally record or report such targeted feedback, so while students receive their marks promptly, these marks are not always justified (Ministry of Education and Science, 2020[4]). Teachers in Bulgaria also seem focused on numerical marks: interviews undertaken by the OECD review team suggest that even when assessing project-based learning, teachers developed complex formulae to calculate a student’s mark. When numerical marks become the main focus of learners’ attention in this way, they can reduce the impact of written comments (Elliott et al., 2016[17]).
As well as assigning qualitative and quantitative descriptors for continuous assessments, teachers must assign an end-of-term (Grades 4-12) and end-of-year evaluation (Grades 1-12). In Grade 1, this is a general mark for all subjects; from Grade 2, marks are awarded for each subject. These evaluations should be based on both the student’s performance in continuous assessments and a final examination. The lowest value, “2” or “poor”, is considered a “fail” and requires either additional support only (Grades 1-3) or additional support and a resit examination if awarded at the end of the year (Grades 4-12). If the mark does not improve in the resit examination, students must repeat the school year.
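The progression rule described above can be summarised as a short decision sequence. The sketch below is a schematic reading of the requirements as presented in this chapter, not an official algorithm, and the wording of the outcomes is illustrative.

```python
# Schematic sketch of the end-of-year progression rule described above.
# A mark of 2 ("poor") triggers additional support only in Grades 1-3; in
# Grades 4-12 it triggers additional support plus a resit examination, and a
# mark that does not improve at the resit leads to grade repetition.
from typing import Optional


def progression_decision(grade: int, end_of_year_mark: int,
                         resit_mark: Optional[int] = None) -> str:
    if end_of_year_mark > 2:
        return "progress to the next grade"
    if grade <= 3:
        return "progress with additional support"
    if resit_mark is None:
        return "additional support and resit examination required"
    return "progress to the next grade" if resit_mark > 2 else "repeat the school year"


print(progression_decision(grade=6, end_of_year_mark=2, resit_mark=2))
# -> "repeat the school year"
```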
This emphasis on achieving a better mark in order to proceed to the next grade, as opposed to focusing on ensuring a fuller mastery of the subject, has the potential to narrow learning further. Moreover, PISA data indicate that this policy is not effectively supporting the remediation of learning gaps. Grade repetition is not common in Bulgaria: only 4.5% of students participating in PISA 2018 reported having repeated a grade, which was below the OECD average of 11.4% (OECD, 2020[18]). While this is positive – grade repetition is both educationally and financially inefficient – given that PISA data also indicate that around a third (32%) of Bulgarian 15-year-olds failed to meet minimum proficiency levels in any of the three core PISA disciplines (reading, mathematics and science), many Bulgarian students appear to be advancing through the school system without having learning gaps identified and addressed. This raises several concerns about the focus on examinations and numerical marks over learning, the accuracy of teachers’ judgements and the extent to which assessments evaluate important knowledge and skills.
Students are also awarded a final evaluation at the end of each education phase, which is entered on the relevant certificate of completion. Particularly for lower education phases, the inclusion of end-of-year results on certificates of completion is not common among OECD countries or other countries in the region. This practice means that even Bulgaria’s continuous assessments have high-stakes consequences because they feed into the end-of-year evaluations that determine progression to the next grade level, appear on certificates and, in some grade levels, inform competitive selection processes for school places. This practice risks undermining more formative forms of assessment.
Formative assessment is not consistently applied in classrooms
In many education systems, the move to competency-based curricula has been paired with more formative approaches to assessment. In addition, there have been efforts to create a better balance between this and summative assessment in the classroom, recognising that both play a role in student learning. Bulgaria’s Ordinance 11 establishes formative approaches to assessment, such as the use of start-of-year diagnostic tests. Such tests can produce detailed information about individual students’ strengths and weaknesses and should inform future planning, differentiated instruction and remedial efforts (OECD, 2013[14]). In the wake of school closures and disrupted instruction during the COVID-19 pandemic, such efforts are particularly valuable (OECD, 2020[19]). In Bulgaria, there is scope to expand formative approaches with younger learners as teachers cannot assign numerical marks to students in Grades 1‑3 and there is an explicit expectation to implement remedial measures in the case of “poor” performance.
In reality, for both younger and older students in Bulgaria, formative classroom assessment is not a common practice and there appears to be some misunderstanding among teachers about the difference between summative and formative assessment methods and how they are interrelated. For example, some practitioners who spoke with the OECD review team did not distinguish between formative assessment and continuous assessment. In fact, continuous assessment can serve both summative and formative purposes (Muskin, 2017[20]). Furthermore, the start-of-year diagnostic assessments are not consistently applied and do not always serve the intended purposes (i.e. identifying gaps in students’ learning, tailoring teaching and learning to students’ needs, or supporting evidence-based progress-focused conversations between teachers, learners and parents). Other countries mandating diagnostic assessments (e.g. Romania and Serbia) face similar challenges (Maghnouj et al., 2019[21]; Kitchen et al., 2017[2]). In Bulgaria, teachers appear more likely to use diagnostic tests to establish an entry-level mark, with a view to comparing this to an exit-level mark at the end of the school year. Some subject syllabi even appear to promote this approach (Ministry of Education and Science, 2017[22]) while, during interviews conducted by the OECD review team, teachers sometimes referred to facing resistance from parents when recommending their child receive remedial instruction following the diagnostic test. Although there are some effective remediation efforts within the system, such as the Support for Success programme, these also reinforce the idea that remediation is an additional support mechanism rather than being a key element of effective assessment cycles within classroom practice.
Bulgaria’s assessment policy framework may be contributing to these misconceptions or misapplications. Ordinance 11 lacks clear, comparative definitions of formative and summative assessment that outline their distinct roles. Although ultimately the two approaches are synergic and cannot be clearly separated (Black and Wiliam, 2018[16]), for teachers working in a system under transition, clarification around the two approaches would be useful. Furthermore, by requiring very regular continuous assessment with numerical marks, Ordinance 11 directs teachers to implement assessments that emphasise performance as opposed to process or improvement. There is also little time within the assessment schedule for formative feedback loops, particularly given that Bulgaria’s academic year is comparatively short (EC/EACEA/Eurydice, 2020[23]) and that teachers perceive the curriculum to be overloaded. Although there is some reference to the formative function of assessment within Ordinance 11, time pressures make realising this seem unlikely. For example, between the end-of-year examinations and corrective examinations, there may only be two weeks for remedial efforts.
National assessments
National assessments are designed to provide nationally comparable information on student learning, principally for system monitoring. As such, Bulgaria’s national assessments are covered primarily in Chapter 5 of this report. Like examinations, national assessments are usually externally designed and administered but, unlike examinations, they do not carry consequences for students’ progression. In addition to enabling national system monitoring of learning outcomes, they can also serve other purposes, such as ensuring that students meet national learning standards and supporting broader school accountability efforts. Across the OECD, the vast majority of countries (around 30) have national assessments to provide reliable data on student learning outcomes that are comparable across different groups of students and over time (OECD, 2015[24]). Bulgaria’s national assessment does not currently measure progress over time and has limited pedagogical value. Moreover, the assessment’s selection function has been criticised for pressuring students and encouraging a narrow focus on test preparation. These features not only prevent the national assessment system from serving either monitoring or formative functions but also risk having an adverse effect on students who do not plan to attend competitive, elite upper secondary schools.
Bulgaria’s national assessment system has significant implications for students
Students in Bulgaria sit census-based national assessments at three key transition points in their schooling: Grade 4 (end of primary education), Grade 7 (end of lower secondary education) and Grade 10 (end of compulsory education). These National External Assessments (NEAs) are developed and administered by the Center for Assessment of Pre-school and School Education (hereafter, the Center for Assessment). All students are assessed in mathematics and Bulgarian language and literature and some choose to take assessments in foreign languages. The NEA uses a single test instrument to serve multiple purposes, including system monitoring and identifying individual student progress (see Chapter 5).
In some respects, the NEA reflects national assessment systems found in other European Union (EU) and OECD countries; however, a unique feature of Bulgaria’s NEA is that it can have important implications for individual students. In all three grades, NEA results are entered onto the student’s certificate of completion for the education phase, although a minimum level is not required for phase completion. For a small share of Grade 4 students, NEA results help determine academic selection into high-performing, elite schools that specialise in mathematics or foreign languages. For a similarly small number of Grade 10 students, specifically those transitioning from an integrated school to a school that offers the second stage of upper secondary, the NEA also informs admission processes. The implications of NEA results in Grade 7 are much more significant, as explained below. For this reason, although the Grade 7 NEA is covered in detail as a system evaluation tool in Chapter 5, it also needs to be taken into account when reviewing how effectively assessments and examinations are supporting learning at the level of individual students. The fact that the NEA also has some consequences, for teachers and schools (see Chapters 3 and 4), means that its influence on the teaching and learning that takes place in the system is significant.
National examinations
National examinations are centrally developed standardised assessments that have formal consequences for students. In Bulgaria, the State Matriculation examination in Grade 12 certifies student achievement at the end of upper secondary education and supports progression to tertiary education, for example by allocating state scholarships. Most OECD countries administer national examinations at the end of upper secondary education for one (or both) of these purposes; however, national examinations are becoming less common at other key transition points, as policy makers seek to remove barriers to progression and reduce early tracking (Maghnouj et al., 2020[25]). This is not the case in Bulgaria where the Grade 7 NEA acts as a national examination at the end of lower secondary education.
The Grade 7 NEA acts as a national selective examination to allocate students to upper secondary education
The Grade 7 NEA has two key uses. The first is to assess student proficiency in core skills, which helps to fulfil a system monitoring function and determine whether students have achieved the minimum standards required to graduate and progress from lower secondary education. The second use, which is more challenging, aims to inform the placement of students into upper secondary school. The selection process sees students access their NEA results on line then apply to an unlimited number of schools of their choice. REDs determine a minimum score required for entry into each school, based on students’ Grade 7 NEA results and teacher-assigned marks for mathematics and Bulgarian language and literature. The weighting of results is at the discretion of each school so that a profiled school with mathematics and science pathways may place more weight on mathematics results. Students are then offered a school place and if they do not accept the offer, they enter a second round of selection, then a third and so on until all students have been placed.
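Because the weighting of NEA results and teacher-assigned marks is left to each school’s discretion, the admission score behind this ranking process can be thought of as a weighted sum. The sketch below illustrates that logic; the weights, score scales and function names are assumptions for demonstration and do not reflect any particular school’s formula.

```python
# Illustrative sketch of a school-specific admission score for the Grade 7
# selection process. The weights and score scales below are assumptions for
# demonstration; in practice each school sets its own weighting of NEA results
# and teacher-assigned marks in mathematics and Bulgarian language and literature.

def admission_score(nea_bgl: float, nea_math: float,
                    school_mark_bgl: float, school_mark_math: float,
                    weights: dict) -> float:
    """Weighted sum of NEA results and teacher-assigned school marks."""
    return (weights["nea_bgl"] * nea_bgl
            + weights["nea_math"] * nea_math
            + weights["school_bgl"] * school_mark_bgl
            + weights["school_math"] * school_mark_math)


# A profiled mathematics school might weight NEA mathematics more heavily.
math_profile_weights = {"nea_bgl": 1, "nea_math": 3,
                        "school_bgl": 1, "school_math": 1}

score = admission_score(nea_bgl=72, nea_math=85,
                        school_mark_bgl=5.50, school_mark_math=6.00,
                        weights=math_profile_weights)
print(score)
# Students are then ranked against the minimum entry score set for each school,
# with unplaced students entering successive ranking rounds.
```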
Under this ranking system, around two-thirds of students get their first choice of school (see Figure 2.1). This suggests that many students are not applying to over-subscribed or highly selective schools and so the competitive pressures of the examination are not the same for all students. However, a small share – around one in ten – participate in the ranking process more than five times and a considerable share – around one in four in 2020 – go through it three times or more. This could signal either that these learners have not been sufficiently supported to apply for schools or programmes that realistically suit their abilities, or that the opportunities available to them are limited, for example because the schools perceived to be of higher quality are over-subscribed and highly selective or because other available schools are an unattractive choice.
The high stakes associated with the Grade 7 NEA have implications for educational quality and equity
Interest in the Grade 7 NEA results, known as the “Little Matura” to the general public, is intense among parents and the media alike. From the view of broader society, enabling students to transition to a good school is now the NEA’s main role (Dimitrova and Lazarov, 2020[9]). In 2019/20, when the COVID-19 pandemic led to school closures and learning disruption, Bulgaria’s ombudsman proposed cancelling the Grades 4 and 10 NEAs, backed by a petition signed by 18 000 people (Kovacheva, 2020[26]); there was no public discussion about cancelling the Grade 7 NEA.
Until 2010, the Grade 7 NEA was explicitly designed in two parts: a compulsory part 1 determined minimum proficiency in core skills across all students; an optional part 2 fed into the competitive selection process and was only required for students applying to specific elite schools – about 40% of the cohort. Now, even though all students must participate in the selection process, disparities in educational outcomes across Bulgaria’s school network mean that, for students in rural areas at least, school choices are limited and competition for places varies considerably. Although students have the option to apply to schools outside their region, as schools with higher educational outcomes tend to be located in urban areas and clustered in Sofia, only those students with the means to travel or leave home for upper secondary education can access these opportunities. This process raises equity concerns and means that Grade 7 in general, and the NEA in particular, carries high stakes for many students. Moreover, it indicates that the large share of students getting their first “choice” may mask significant disparities in opportunity.
Despite these concerns, some teachers maintain positive attitudes towards the Grade 7 NEA, identifying it as an important factor in motivating students, testing their capacity to perform under stress and facilitating upper secondary teaching by grouping students by ability. While this may be true, the high-stakes nature of the Grade 7 NEA has considerable negative implications for the education system. First, in response to the pressure on students in Grade 7, families may engage in private tutoring. Although evidence and data regarding the extent of private tutoring in Bulgaria are scarce, anecdotal evidence reported to the OECD review team indicates that, among families that can afford it, private tutoring in the months – or even years – leading up to the Grade 7 NEA is widespread. Moreover, this is a common practice in neighbouring countries which also have high-stakes examinations at key transition points (Kitchen et al., 2019[27]; 2017[2]). Internationally, such practices have been seen to increase the achievement gap between advantaged and disadvantaged students (Zwier, Geven and van de Werfhorst, 2021[28]).
Furthermore, while having a greater variety of school types and programmes can cater for the diverse needs of students, without careful regulation and implementation, it can also increase horizontal stratification as students’ background may inform decisions about school choice more strongly than their interests or aptitudes. As shown in Figure 2.2, Bulgarian schools are more highly segregated along socio-economic lines than schools in any OECD member country. On paper, Bulgaria has up to 10 different school types available to students in upper secondary education and 14 different curricular pathways through the profiled subjects, offering students the greatest level of choice among EU countries (EC/EACEA/Eurydice, 2020[29]). However, academic selection means that educational pathways are often decided at age 13 and that real choice by the time students reach upper secondary level is highly constrained. While there may be advantages to providing older students with a range of pathway choices that are better tailored to their strengths, needs and ambitions, very early tracking, as seen in Bulgaria, has been shown to strengthen the association between socio-economic background and achievement and widen the learning differences between students (EC/EACEA/Eurydice, 2020[29]; Levin, Guallar Artal and Safir, 2016[30]; Woessmann, 2009[31]).
Bulgaria’s Grade 7 NEA and associated selection process may also be inhibiting educational quality in other ways. International research indicates that the existence of academically selective schools does not have a positive association with a school system’s overall performance (Andrews, Hutchinson and Johnes, 2016[33]). In fact, some research suggests that academic streaming and specialisation are much more common in low-performing education systems (Daniell, 2018[34]). At the same time, the NEA may inhibit the implementation of the competency-based curricula as the high-stakes nature can have a distorting effect on the curriculum. Finally, as the assessment does not yet assess competencies in a meaningful sense, teachers and students are less motivated to spend learning time on these skills.
State Matriculation examination results certify completion of upper secondary education and support progression to tertiary education
Bulgaria’s State Matriculation examinations perform several functions. Since the 2007/08 school year, passing the State Matriculation examination has been required to certify completion of upper secondary education: students who pass the examination and successfully complete their upper secondary courses receive a diploma of upper secondary education. However, sitting the examination is not compulsory; students who do not take or pass it are still awarded a certificate of completion of Grade 12, with which they can progress into post-secondary vocational education programmes.
The State Matriculation examination also supports progression into higher education. All students applying to tertiary education must have successfully passed the State Matriculation examinations and many universities or university programmes use the results from the State Matriculation examination as part of their specific criteria for selection and enrolment. This aligns with international practices: most OECD countries have centralised examinations at the transition point between schooling and tertiary education (OECD, 2017[35]) and an increasing number of countries use a single examination for both school graduation and university selection purposes. Nevertheless, in Bulgaria, some universities or faculties continue to set their own examinations or selection criteria; this includes the most competitive ones (e.g. medicine).
All students opting to take the State Matriculation examination must take Bulgarian language and literature and, as of 2012, a second compulsory examination in a subject of their choice. Students also have the option to take the examination in an additional two subjects. For students in general education who have studied profiled programmes, the additional subjects must come from among their profiled subjects (e.g. a foreign language). Compared to other countries in the region, most students in Bulgaria sit fewer examinations covering a narrower span of the curriculum. In Albania, North Macedonia and Romania, for example, alongside optional subjects, national examinations at the end of upper secondary education have three compulsory subjects: the native language, a foreign language and mathematics (or computer skills in Romania). In recent years, the most popular elective subjects among Grade 12 students in Bulgaria were English, and biology and health education. Very few students opted to take the physics and astronomy or chemistry and environmental protection examinations, and only 7% chose mathematics, even though these subjects are most closely aligned with Bulgaria’s national priority of enhancing STEM skills (Figure 2.3). The OECD review team heard that this may be because students opt for subjects perceived to be less demanding.
Finally, results from the State Matriculation examinations are used to award state scholarships for students progressing to higher education in public universities. To be eligible to apply for a scholarship, a student must perform among the top 10% of students in Bulgarian language and literature and at least meet the national average in their second subject. Alternatively, if the second subject is mathematics, physics and astronomy, or chemistry and environmental protection, they must place in the top 30% of students sitting that examination and meet or exceed the national average in Bulgarian language and literature. The government prioritises certain courses or fields for state scholarships; these are decided annually by the Council of Ministers.
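The eligibility rule described above combines a percentile condition with a comparison against national averages. A minimal sketch of that logic follows; the variable names and input values are hypothetical and serve only to illustrate the two routes to eligibility.

```python
# Minimal sketch of the state scholarship eligibility rule described above.
# Route 1: top 10% in Bulgarian language and literature (BLL) AND at least the
#          national average in the second subject.
# Route 2: top 30% in mathematics, physics and astronomy, or chemistry and
#          environmental protection AND at least the national average in BLL.
# All variable names are illustrative; the data below are hypothetical.

STEM_SUBJECTS = {"mathematics", "physics and astronomy",
                 "chemistry and environmental protection"}


def eligible_for_scholarship(bll_percentile: float, bll_score: float,
                             second_subject: str, second_percentile: float,
                             second_score: float, national_avg_bll: float,
                             national_avg_second: float) -> bool:
    route_1 = bll_percentile >= 90 and second_score >= national_avg_second
    route_2 = (second_subject in STEM_SUBJECTS
               and second_percentile >= 70
               and bll_score >= national_avg_bll)
    return route_1 or route_2


print(eligible_for_scholarship(bll_percentile=93, bll_score=5.8,
                               second_subject="biology and health education",
                               second_percentile=60, second_score=4.9,
                               national_avg_bll=4.2, national_avg_second=4.5))
# -> True (via route 1)
```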
Administration and marking of the State Matriculation examination is highly trusted
Development and administration of the State Matriculation examinations are overseen by the Center for Assessment and processes are tightly controlled and carefully monitored (Table 2.4). Numerous expert and technical commissions annually carry out different stages of the design, administration and marking process. There are also strict security measures, such as video surveillance in examination centres and police escorts for the movement of papers. This has helped to build a high level of public confidence in the process over a reasonably short amount of time. The Center for Assessment has also been working to strengthen the State Matriculation examination’s validity, reliability and integrity to encourage higher education institutions to accept results as a metric for admissions decisions. These efforts have been successful: currently, 38 out of 52 higher education institutions in Bulgaria accept the results as an entry requirement for their programmes, although they may also choose to apply additional criteria. As explained above, those that do not accept the examination’s results tend to be the most competitive institutions or programmes. However, there are signs that this is changing too: in 2021, a Council of Ministers decision formally enabled law faculties to accept undergraduate students solely based on the results of the State Matriculation examinations. These are positive developments, since prior to 2008, all tertiary institutions applied their own entry criteria, making the transition into higher education less transparent.
Table 2.4. Design and procedural considerations for the State Matriculation examination
| Topic | Specifications | Notes |
|---|---|---|
| Testing mode | Paper-based. Oral and practical examination where relevant. | |
| Testing conditions | Administered in schools; students sit the examinations in a school in their region but not necessarily the school in which they studied. Examination rooms are under video surveillance. | Overseen by regional commissions. |
| Test subjects | Compulsory: 1. Bulgarian language and literature; 2. Profile subject (compulsory modules only). Optional: student’s free choice. | For vocational students, compulsory: 1. Bulgarian language and literature; 2. State examination for awarding professional qualifications. |
| Item types | Mixed approach (closed-ended or fixed-response and open-ended). | For each subject, the item types and their distribution are prescribed in the SES for Profiled Programmes. |
| Marking | On-screen (computer-based) marking by human markers. | Results are published on line around two weeks after the examinations. The diploma of secondary education specifies a general performance mark. |
| Management and leadership | At the national level: the Center for Assessment, overseen by the Ministry. At the regional level: REDs, which establish regional commissions for the administration of the examinations. | The Center for Assessment establishes national commissions: for the preparation of examination tasks in each subject; for assessing the examination tasks; for inspecting the examination papers in each subject; for classification and declassification of examination papers; and for electronic processing of the papers. |
| Use of results | Certification of completion of secondary education. Application to higher education (38/52 higher education institutions). Awarding of state scholarships for tertiary studies. | Vocational students are also issued a certificate of vocational qualifications. Results can be transformed into equivalent marks for the European Credit Transfer and Accumulation System (ECTS) and recorded in the European annexe to the diploma for secondary education. |
Source: Ministry of Education and Science (2016[12]), Наредба No. 11 от 1 Септември 2016 г. за Оценяване на Резултатите от Обучението на Учениците [Ordinance No.11 of 01 September 2016 for the Evaluation of the Results of Student Learning], https://www.lex.bg/en/laws/ldoc/2136905302 (accessed on 18 August 2021).
Safeguards are in place to mitigate potential negative effects of the State Matriculation examination
There is a risk that high-stakes assessments might distort the education process by narrowing the curriculum and putting an excessive focus on assessed skills (OECD, 2013[14]). It is therefore important to establish safeguards that manage the pressure and attention placed on a particular assessment. For the State Matriculation examination, Bulgaria has several such measures in place. For example, students who do not pass the examination have the opportunity to take the test again an unlimited number of times. The pass mark for all subjects is 30% and few students (6-8%) fail the examinations at the first sitting. In fact, in many subjects, a substantial share of students achieve the highest mark; this is particularly true of foreign languages where over half of the cohort achieve “excellent” scores (Figure 2.3).
While the very low rate of failure in the State Matriculation examination could help minimise the sense of academic pressure students may experience, it is important that results accurately reflect student competencies. At age 15, 47% of students in Bulgaria were considered to have not reached minimum proficiency (Level 2) in reading in PISA 2018, whereas two years later, only 8% of students taking the State Matriculation examinations in 2020 failed the Bulgarian language and literature examination. Although some students will have chosen not to continue into the final stage of upper secondary education, the wide disparity between these shares indicates considerable inconsistencies in how minimum proficiency is defined. Furthermore, awarding an “excellent” to such large shares of students can devalue the examination and render it less illustrative of the differences in students’ abilities. Efforts to mitigate the consequences of this high-stakes test, therefore, need to be more carefully balanced with the examinations’ purpose and design to ensure an accurate reflection of minimum proficiency and sufficient mark distribution among students.
Another safeguard of the State Matriculation examination is that students choose three of the four examination subjects and may even choose to only sit the two compulsory subjects. This level of flexibility allows students to select subjects based on their study interests, personal strengths and any possible requirements for admission into the further education or training pathway of their choice. Nevertheless, while this element of choice can be important in motivating older students to continue with their education and personalise their pathways, it must not be to the detriment of ensuring a minimum common base of core knowledge and skills.
Recent revisions indicate efforts to embed a competency-based approach within examination materials
In the 2021/22 academic year, Bulgaria will implement newly updated curricula for Grade 12, revised to embed a competency-based approach to instruction and new requirements of profiled education (see Chapter 1). Accordingly, the specifications for the State Matriculation examination in each subject have been updated and will be administered starting in May 2022. While some assessed competencies under the new subject specifications are still expressed in terms of what students should know (e.g. “Knows the main processes in the development of the Bulgarian literary language”), the vast majority are expressed through higher-order cognitive verbs that require demonstrating specific skills (e.g. “Evaluates texts according to the success of the communicative goal” and “Analyses and creates written texts, adequate to the communicative situation”). This contrasts significantly with the previous iterations of the State Matriculation examinations’ specifications, which demonstrated learning in much more abstract and general terms (e.g. “Knows the structure and functioning of a work of art” [Bulgarian language and literature]). Although it remains unclear how these changes will be reflected in the design of new test items, the updated specifications signal a shift from knowledge recall to more complex outcomes and higher-order competencies and provide a useful reference from which item writers can ensure the State Matriculation examinations test student competencies in real-world contexts.
National student assessment agencies
The Center for Assessment is responsible for national assessments and examinations
Bulgaria’s Center for Assessment is responsible for developing and approving test material for the NEAs and the State Matriculation examinations, as well as supporting REDs and school management teams to administer the tests. The Center for Assessment also manages Bulgaria’s participation in international assessments and analyses national and international assessment results. This information is reported periodically to the Ministry to help monitor the quality of schooling. As the Center for Assessment’s mandate has expanded in recent years with the introduction of new national testing instruments (and at additional grade levels), the centre’s responsibilities have outgrown its resources. The number of permanent staff is small (around 20 individuals) and external experts are recruited annually to help design and manage various testing instruments. While this process helps mobilise and strengthen assessment expertise within Bulgaria, it also inhibits the development of institutional memory and expertise within the Center for Assessment. To ensure the range of assessment tools remains relevant and sustainable, the Ministry will need to ensure the Center for Assessment has adequate financial and human resources, as well as support to develop the expertise of staff in areas of need, such as psychometrics. Chapter 5 of this report explores this further.
Policy issues
There is a clear political will to improve educational outcomes for all students in Bulgaria. However, despite numerous high-level reforms in recent years, the teaching and assessment practices needed to achieve this are not yet a reality in many Bulgarian classrooms. Narrow assessment approaches focused on knowledge memorisation are deeply entrenched, and a longstanding focus on summative scores is hindering the use of more formative practices that have the potential to improve learning outcomes. While the government has taken initial steps to address these issues, for example by introducing diagnostic assessments at the start of the school year, teachers need additional training and support to use these tools effectively and to develop their classroom assessment literacy. Bulgaria also needs to review the validity and fairness of the upper secondary education entrance examination, while critically questioning its place in the overall school system in the longer term. Finally, by improving the validity of the State Matriculation examination, Bulgaria can take advantage of an opportunity to positively influence learning and assessment in classrooms while also facilitating students’ transitions beyond formal schooling. Together, these efforts are critical if Bulgaria is to achieve its dual goals of enhancing educational quality and improving outcomes for all students.
Policy issue 2.1. Building a shared understanding of student assessment as a means to support teaching and learning
Bulgaria has a clear intention to modernise pedagogical and other educational approaches within its school system. Nevertheless, extensive reform to policy documentation has not been accompanied by pedagogical innovation or practical changes in student assessment. As a result, student assessment at the classroom and system levels does not align with the type of learning valued in Bulgaria’s new curriculum, diminishing the intended impact of reforms. This is, at least in part, a cultural challenge evident in other countries in Eastern Europe and Central Asia. However, it is also symptomatic of an implementation gap following the 2016 curricular reforms. To fully realise the promise of its educational reforms, Bulgaria needs to communicate the need and rationale for adopting new approaches to assessment, especially in the classroom. At the same time, school leaders and teachers will need support to implement pedagogical changes. Enhancing the link between assessment and learning through a clear and coherent policy framework, as well as providing practical supports for educators to apply in the classroom, can help in this regard.
Recommendation 2.1.1. Establish a coherent national vision of student assessment
There are contradictions within Bulgaria’s current evaluation and assessment policy framework that send mixed messages about the role and purpose of student assessment. Ordinance 11 calls for frequent classroom assessment in all subjects with the assignment and reporting of numerical marks. This is not conducive to measuring more complex competencies and does not allow time for impactful feedback loops. Bulgaria’s emphasis on high-stakes, summative assessments may also inhibit the intended changes. For example, the Grade 7 NEA, originally intended as a system monitoring tool, has become the pivotal moment in a child’s education, with strong potential for a negative backwash effect on the curriculum in preceding grades. Recent policy efforts have tried to address some of these challenges, for example by including formative assessment among the new school inspection criteria. However, there remains a pressing need for a shared national vision of student assessment that is clear and can be applied to real-life teaching and learning situations, as well as to high-level policy processes and communications with stakeholders.
Formulate a high-level national vision of student assessment
Bulgaria needs to clearly establish student assessment as a critical and central part of the learning process in the minds of students, educators and the wider public. Establishing broad consensus around a common vision of assessment that can be upheld across administrations and levels of government will be crucial in achieving deeper and more long-lasting changes in teaching and learning. This shared vision should be formalised in both legislation and accompanying explanatory materials for different audiences to establish a clear reference point for actors across the education system in years to come. Such documentation has proved useful in high-performing education systems as a way of enhancing transparency around national values with regard to assessment practices. In New Zealand, for example, a national vision of assessment has helped ensure that key principles, endorsed by a broad coalition of actors, have informed reform processes for over a decade (Box 2.1).
Box 2.1. Formalising a national vision of assessment in school education in New Zealand
In 2011, New Zealand’s Ministry of Education introduced a Position Paper on Assessment (2011[36]). The paper provides a formal statement of the country’s vision for assessment at the school level. It places assessment firmly at the heart of effective teaching and learning and describes what the assessment landscape should look like if assessment is to be used effectively to promote system-wide improvement within and between all layers of the schooling system. The paper broadly informs and directs policy processes rather than describing in detail how to achieve the ideal assessment landscape. The intention was to promote a shared philosophy among parents, teachers, school leaders, school boards, Ministry of Education and other sector agency personnel, professional learning providers, writers of educational materials and researchers, as well as journalists, commentators and other thought leaders who access, publish and comment on assessment data. As of 2021, it remains in place, having informed and directed policy reviews across multiple administrations.
The paper was informed by a comprehensive expert review of assessment practices in New Zealand and includes a presentation of the context, current assessment practices and approaches and detailed illustration of how assessment can drive learning for the learner, the school and the system as a whole. The key principles highlighted in the paper are: the student is at the centre; the curriculum underpins assessment; building assessment capability is crucial to improvement; an assessment capable system is an accountable system; a range of evidence drawn from multiple sources potentially enables a more accurate response; effective assessment is reliant on quality interactions and relationships.
Source: Nusche, D. et al. (2012[37]), OECD Reviews of Evaluation and Assessment in Education: New Zealand 2011, https://doi.org/10.1787/9789264116917-en; Hipkins, R. and M. Cameron (2018[38]), Trends in Assessment: An Overview of Themes in the Literature, https://www.nzcer.org.nz/system/files/Trends%20in%20assessment%20report.pdf (accessed on 18 August 2021); NZ Ministry of Education (2011[36]), Ministry of Education Position Paper: Assessment [Schooling Sector], Ministry of Education of New Zealand, Wellington.
While existing policy documentation in Bulgaria often focuses on logistical and organisational aspects, the national vision of assessment should adopt a more substantive, evidence-based approach. It should include a clear statement of purpose, providing the rationale for a shift in assessment culture and underlining what the new approach means for pedagogy. Given their absence from other policy documentation, a comprehensive overview of the various components and instruments included in Bulgaria’s national assessment framework, as well as their different purposes, added value and how they work together, would also be useful. In this way, developing the shared national vision for assessment can not only help build a new assessment culture but also help align Bulgaria’s broader evaluation and assessment framework for the education sector.
Engage stakeholders in developing the new vision of student assessment
The complexity of 21st century education systems means that a vision imposed from above is unlikely to gain traction and may exacerbate mistrust (Viennet and Pont, 2017[39]). To achieve real change in Bulgaria’s student assessment practices, the full range of education stakeholders will need to be engaged in an evidence-based discussion on the role of assessment and how it can best support learning, as well as in establishing practical steps for implementing change. The Ministry should identify key stakeholders (e.g. students and parents, the school community, system actors, researchers, non-governmental organisations, media) and facilitate a national conversation by holding a combination of in-person and online workshops and consultations. This will support more efficient use of resources, as well as a more inclusive and timely process that can facilitate real change. For example, in 2015, Ireland introduced the Junior Cycle Profile of Achievement, a new reporting process for student achievement which shifted the focus from end-of-cycle examinations towards ongoing assessment for and of learning and continuous formative feedback to students. The government held regular consultations with key actors, and representatives of the profession were able to voice concerns about the extra workload the changes would bring to educators. In response, the government and the teacher unions established five immutable principles of the reform focused on supporting teachers during the implementation stages (OECD, 2020[40]). Reform implementation became a more collaborative process and has received wider buy-in from the profession.
In Bulgaria, the Directorate for the Content of Pre-School and School Education would be well-placed to oversee these consultations, as this body organised workshops in the past to support Bulgaria’s curricular implementation. The Ministry could also partner with external actors (e.g. a non-governmental or international organisation) to offer some external validation of the process, which may help build consensus. Reviewing good practices nationally and internationally, such as achieving a strategic balance of formative and summative assessment and building assessment capacity among educators and other actors across the system, could help the government ensure the consultation process is informed by evidence. Mapping current assessment practices and regulations would also be important in this regard.
Clarify and better communicate expected learning outcomes to guide student assessment
Many OECD countries have introduced learning outcomes and performance standards to help enhance teaching and assessment practices (OECD, 2013[14]). These define and illustrate in measurable terms what students are expected to master at a certain level of education and can support teachers and other actors responsible for preparing assessment material to develop valid assessment instruments and thus elicit more reliable data about student progress (OECD, 2019[3]). With the move to a competency-based curriculum, Bulgaria introduced expected learning outcomes by subject and grade level. However, perceived curriculum overload, a proliferation of related documentation and a lack of specificity mean that Bulgaria’s expected learning outcomes are not consistently used in classrooms. This should not trigger a rewriting of the expected learning outcomes, as a lot of good work has already been done in developing these across the curricula. However, Bulgaria can strengthen the existing set of expected learning outcomes by making them more coherent, accessible and practical. This can be achieved through the following actions:
Enhance the structure and layout of the outcomes to support clarity. Because the outcomes are currently presented as a list organised according to subject content, teachers in Bulgaria often misinterpret them as a checklist of content to cover rather than a means of assessing and improving learning (Ministry of Education and Science, 2019[8]). Presenting the outcomes as part of a learning progression across consistent subject skill areas over an education phase could help address this and may reduce the sense of overload. It could also encourage subject teachers across age groups to collaborate.
Build in performance standards. Several countries with well-established learning standards have broken down expected outcomes into different levels to support teachers in evaluating students’ progress towards mastery. For example, the Assessing Pupils’ Progress initiative (2010) in England (United Kingdom) provided detailed criteria against which judgements could be made about students’ progress in relation to National Curriculum levels (Ofsted, 2012[41]). Teachers were provided with various materials for their subject and age group: a handbook to guide them in implementing the approach; guidelines for assessing pupils’ work in relation to the performance levels; a one-page matrix organising success criteria; and annotated student work that exemplified national standards at each level (Ofsted, 2012[41]). In Bulgaria, defining each performance level in more measurable terms and illustrating these with examples of student work would help equip teachers to apply the expected outcomes in their classrooms and help students assess their own progress.
Make expected learning outcomes accessible to students and parents. To encourage students’ self and peer assessment, and to foster parental engagement in learning progress, Bulgaria could develop a version of the expected learning outcomes that is accessible to those who are not pedagogical or subject professionals. In England, for example, schools commonly transformed the Assessing Pupils’ Progress criteria into “I can…” statements expressed from the student’s point of view. While teachers may find that such statements oversimplify success criteria, they can help learners, particularly younger ones, better understand what is expected of them.
Ensure alignment and coherence with wider evaluation and assessment practices
Aligning other components of Bulgaria’s evaluation and assessment framework with the national vision of student assessment will help implement the vision and reduce inconsistencies in practice. Previous OECD analysis of education policy processes has found that proactively aligning policies at different levels of the system (e.g. institution, local or system levels) can facilitate stakeholder buy-in, capacity building and greater clarity in terms of progress (OECD, 2019[42]). Bulgaria’s national vision of assessment should not therefore only inform approaches to student assessment but also underpin broader evaluation and assessment efforts in the following areas:
School evaluation: The national vision should trigger updates to Bulgaria’s school quality criteria (see Chapter 4). Including assessment-related criteria in school evaluation rubrics could encourage schools to build their assessment capacity in line with the philosophy set out in the national vision.
Teacher development and appraisal: Bulgaria will need to review the professional profile for teachers to ensure that standards related to assessment align with the national vision (see Chapter 3). Promoting a new assessment culture through initial teacher education and continuous professional development, as well as through the attestation and other appraisal processes could further incentivise adherence to the new vision of student assessment.
System evaluation: The design, purpose and use of the NEAs, as well as other external assessments, will also need to be considered in developing Bulgaria’s new vision of assessment (see Chapter 5).
Communicate the vision in a strategic way to build trust and support for change
Once the vision has been agreed, Bulgaria will need to find ways to ensure that it remains a “living” document for actors across the system. One way to do this is to establish a website dedicated to the national vision of student assessment. For example, when introducing the Project for Autonomy and Curricular Flexibility in 2017 to support the implementation of a new curriculum, Portugal’s Ministry of Education established a website as a digital resource for reflection and the sharing of practices, as well as a digital library of reference documentation to support teachers in their curricular and pedagogical decisions (Portuguese Ministry of Education, 2021[43]). Four years on, the website continues to grow and to document and support the implementation of the project and the curriculum reform. The site includes official legislative and other documentation relating to the reform, examples of good practice from across the country, access to webinars and presentations to support implementation, and regularly updated news and events.
In Bulgaria, this website or digital platform could initially document the national conversation, with news about upcoming online and in-person events, summary records of meetings, consultation exercises and expert reviews. Once developed, the vision and any associated strategies or action plans can be presented on the platform. This would also be a suitable place to house digital versions of expected learning outcomes and support materials. Over time, the website can become a one-stop-shop for student assessment in Bulgaria, with content aimed at teachers, students, parents and the wider public. Several other recommendations in this chapter suggest specific ways to use this platform.
Recommendation 2.1.2. Adapt the reporting of student learning information to promote a broader understanding of assessment
As in other countries, Bulgaria faces the challenge of balancing stated commitments to broader forms of assessment on the one hand with public, parental and political pressure for accountability in the form of scores and rankings on the other. While attention to results and data is a positive feature of education systems, an overemphasis on these may have a negative impact and undermine the formative role of assessment (OECD, 2013[14]). Changing specific marking and reporting practices will therefore be important in making the national vision of student assessment a reality in classroom practice. Other OECD countries where summative scoring has tended to weigh heavily, such as France, have found revisions to student reports and marking to be a particularly effective way to communicate and embed new expectations.
Make classroom and school-level marking practices more conducive to student learning
Marking plays a central role in the work of effective teachers. It can provide important feedback to students and help teachers identify possible misunderstandings (Elliott et al., 2016[17]). Currently, teachers in Bulgaria are encouraged to formally mark students’ work regularly and in a timely manner. However, marking is time-consuming and can contribute significantly to teachers’ non-teaching workload. Research also indicates that overemphasising numerical marks, as in Bulgaria, can discourage learners’ effort and motivation if the information hurts self-confidence or conveys to the student that the return on effort is low (OECD, 2013[14]). Moreover, numerical marks alone do not facilitate progress, as students are not supported in understanding their current level in concrete terms or what to do to improve.
Therefore, it is important that Bulgaria’s policy efforts around classroom and school-level marking processes strike a balance between effectiveness, in terms of impact on student learning, and efficiency, in terms of the use of teachers’ time. This can be achieved by:
Reducing the required frequency of continuous assessments. Across all grades, reducing the frequency with which teachers are required to formally award, report and log qualitative and quantitative descriptors will give teachers more time for deeper marking, meaning they can better articulate to students what they can already do well and what they need to improve. It will also create space within the curriculum and learning time for that marking to be fed back to students in meaningful ways so they can engage with their results and work with teachers to act upon them.
Reframing qualitative descriptors to better promote progress. The current labels used for qualitative descriptors in Bulgaria offer a summative judgement of student achievement in the specific assessment. Reframing these labels as signposts within a progression towards mastery of a competency or skill would better position assessment as part of the learning process. For example, instead of excellent, very good, good, intermediate or poor, Bulgaria’s qualitative descriptors could be expressed as exemplary, accomplished, developing, emerging and undeveloped, as such language can be more motivational for low-performing students.
Requiring descriptive feedback. Written feedback, including corrective feedback, is highly effective for enhancing the learning of new skills and tasks (Wisniewski, Zierer and Hattie, 2020[44]). However, it is also labour intensive for teachers, and there are ways to provide detailed descriptive feedback without requiring written evidence, as well as ways to facilitate these more constructive feedback processes. For assessment to have a greater impact on learning, Bulgaria should require teachers to regularly provide descriptive feedback to students beyond the qualitative descriptor. This should take the form of individual written feedback at least once a semester; on other occasions, it could more feasibly be provided as oral feedback (either individually or in small groups), as whole-class feedback that targets a common problem across the student group, or as more granular marking through which teachers direct students’ attention to errors without elaborating on them in written comments. Descriptive feedback can also be facilitated by enhancements to the expected learning outcomes, as described above, which provide teachers, students and parents with a common language. Furthermore, reporting templates (see below) could facilitate this type of formative feedback by requiring teachers and/or students to identify what has gone well in a specific assessment and what could be improved in the future.
Strengthen reporting to help students and parents understand broader progress
Internationally, many education systems explicitly prescribe record-keeping and reporting procedures for student assessment (Li et al., 2019[1]). This often goes beyond logistical requirements such as timing and includes more substantive guidance such as providing common report card templates (Box 2.2). In Bulgaria, besides some information in Ordinance 11 regarding the timing of reports to students, recording and reporting student progress is at the discretion of schools or teachers, which can lead to inconsistencies in practice. To make reporting more conducive to student progress, the Ministry should develop a national report card template that makes space for descriptive and formative feedback, as well as summative scores. By requiring students to input their own learning targets and to log reflections on the teachers’ comments about their progress, the report cards could also support students in driving their own learning.
The Ministry should also develop guidance materials to explain how teachers, students and parents should use these report cards. Such actions could help facilitate more impactful classroom assessment practices while imposing a standardised procedure that reduces external pressure on teachers to focus on numerical marks. In particular, the Ministry should provide guidance on how to report feedback to parents, per the requirements set out in Ordinance 11. This can be done by sharing best practices for improving communication between teachers and parents (e.g. phone calls, email, videoconference or in-person meetings), including the circumstances under which each mode is most pertinent and the appropriate frequency of communications. This guidance could be located on the digital platform for assessment (see Recommendation 2.1.1).
Box 2.2. Enhancing the recording and reporting of student assessment data in Denmark
Since 2006, all primary and lower secondary schools in Denmark have been required to provide Individual Mandatory Student Plans (IMSPs) tracking student progress. These include a summary of students’ results and qualitative feedback on how these will be followed up. For national assessments, formative comments on student performance are included but not marks. The IMSPs are not a simple report card or performance tracker but rather a working tool for teachers, forming the basis of discussions between students and teachers, as well as with parents. They also provide a record of student achievement throughout compulsory education, easing transitions between grades. Denmark’s IMSPs continue to evolve, including through conversion to a digital format to make them more accessible to students, parents and teachers. The digital platform enables teachers to collate information on progress, goals and student assessments, recording the specific goals for the individual student, a progress status in relation to those goals and a monitoring section describing how and when to follow up.
Source: Shewbridge, C. et al. (2011[45]), OECD Reviews of Evaluation and Assessment in Education: Denmark 2011, https://doi.org/10.1787/9789264116597-en; OECD (2020[46]), Education Policy Outlook: Denmark, http://www.oecd.org/education/policy-outlook/country-profile-Denmark-2020.pdf (accessed on 18 August 2021).
Policy issue 2.2. Developing the capacity of teachers to use formative assessment
Research has shown that the application of formative approaches to assessment can contribute to substantial achievement gains (Black and Wiliam, 1998[47]). They can be particularly effective for lower-achieving students, thus helping to reduce inequities in learning outcomes and raising overall achievement (OECD, 2013[14]). Formative assessment will also be critical in learning recovery following disruptions to schooling in 2020 and 2021 during the COVID-19 pandemic (OECD, 2020[19]). In Bulgaria, where large shares of students do not master basic skills and where learning gaps and disengagement start young, embedding formative assessment practices in the classroom has the potential to have a considerable positive impact on learning for all students. While formative assessment is generally underdeveloped in Bulgaria, these practices can be built upon the country’s start-of-year diagnostic tests and regular classroom assessments.
Many school-based actors in Bulgaria already aim to make assessment more meaningful and motivational for students. However, as in many OECD countries, formative approaches are commonly misunderstood as “summative assessment done more often” or as practice for a final summative assessment (OECD, 2013[14]). This is partly related to Bulgaria’s assessment culture but also the high visibility of standardised assessments, which puts pressure on teachers to adapt their own assessment practices to mimic the format of national assessments. There is therefore considerable opportunity for Bulgaria to clarify teachers’ understanding of formative assessment, develop their related skills and provide them with practical supports to implement more formative assessments in the classroom.
Recommendation 2.2.1. Promote the use of diagnostic assessments to help teachers better understand and adapt to the learning needs of students
Diagnostic assessment is a type of formative assessment that helps establish a student’s starting point for learning, identify students at risk of failure or disengagement, and plan for an appropriate and more personalised intervention (OECD, 2013[14]). In Bulgaria, however, inconsistencies between system-level outcomes in national and international assessments among older students indicate that gaps in learning develop early in their schooling and go undetected or unresolved as they progress through the system. While the introduction of mandatory start-of-year diagnostic testing is a very positive initiative, this has become a missed opportunity for Bulgarian educators to improve learning outcomes because the diagnostic tests are not consistently deployed effectively in the classroom. This is in large part because of weak formative assessment literacy among teachers and the lack of guidance and support they receive on how to design, administer and use diagnostic assessments.
Prioritise younger students and core subjects to have a greater impact in the long term
To make the most of the mandatory start-of-year diagnostic assessments, Bulgaria needs to introduce a national programme to enhance the quality of their design and use. Directing training or other support initiatives at the entire teacher cohort from the start is likely to dilute their impact. Therefore, Bulgaria should identify key subjects and grade levels on which to prioritise more targeted efforts to enhance diagnostic assessments. Following a pilot period of experimentation and exploration among this more targeted group of professionals, Bulgaria can adopt a staggered implementation approach to reach the full cohort of teachers. This will allow the Ministry to identify components of good practice before investing significant resources in scaling them.
Diagnostic assessment is a key means of identifying learning needs and informing early intervention, and the earlier learning gaps are identified and addressed, the greater the impact remediation can have. Bulgaria should therefore prioritise enhancing the quality of diagnostic assessments for the youngest learners first. In terms of subject areas, PISA 2018 results indicate that proficiency in reading, mathematics and science among Bulgarian 15-year-old students is well below the OECD average. National assessments and examinations suggest that mathematics performance is particularly low. At the same time, through its specialist mathematics schools, Bulgaria has a pool of specialist subject teachers who could collaborate with mathematics teachers in non-specialist schools to share their expertise. Bulgaria might therefore focus initial efforts to enhance the administration and use of diagnostic assessments on the teaching of mathematics in the early years of primary education.
Support teachers to make full use of start-of-year diagnostic assessments
A national programme to enhance the quality of diagnostic assessments will need to tackle both issues of assessment design and how results are used by teachers. In particular, such a programme will need to consider the limitations of the Center for Assessment’s capacity to centrally design diagnostic tests, as well as teachers’ time, motivation and capacity to engage in associated training or to experiment with new resources. It should focus in particular on building teachers’ understanding of the formative purpose of such assessments and how they can be used to change the classroom conversation on learning from one of summative judgement to a collaborative effort by teachers and students to develop core competencies using assessment evidence as a guide. Without measures to address these factors, the assessment data generated from the diagnostic tests will have little impact on classroom assessment, pedagogy and learning. To address this, Bulgaria should take the following actions:
Introduce clear and tailored reporting requirements for diagnostic assessments. Bulgaria will need to provide an incentive or accountability mechanism to ensure that the start-of-year diagnostic assessments inform teaching over the longer term. Requiring teachers to share qualitative feedback from the diagnostic assessments in students’ report cards (see Recommendation 2.1.2) at the beginning of the school year can facilitate this and provide a reference point for the student and teacher to monitor progress and design an individualised learning plan. Critically, this reporting should not include a numerical mark but rather focus on descriptive feedback, identifying what the student can already do and what knowledge or skills need strengthening. Clear expected learning outcomes, broken down into progress levels (see Recommendation 2.1.1) would also support teachers in this process.
Develop a central database of diagnostic assessment tools for teachers. Bulgaria’s diagnostic assessments need to provide fine-grained information that allows teachers to uncover specific strengths and difficulties of individual students in relation to the curriculum. Developing such assessments is a labour-intensive process and requires a high level of expertise. Currently, teachers in Bulgaria, as in many other education systems, have neither the time nor the skills to do this. Education systems, including Estonia, France and Romania, have found it more efficient and effective to provide teachers with centrally developed diagnostic assessments and related tools. In Estonia specifically, diagnostic tools are digital, which facilitates both administration and results analysis (OECD, 2019[42]). In France and Romania, the assessments are standardised nationally for key grade levels. Initially, Bulgaria’s Center for Assessment may lack the internal expertise and resources to achieve either of these approaches so external actors may be called upon for support. This could include private assessment design companies, academic researchers within higher education institutions, non-governmental education organisations, such as Teach for Bulgaria, or international organisations.
Establish supports for administering and using diagnostic assessments. Teachers in Bulgaria would benefit from guidelines for best practice and modelled examples of successful application and use of diagnostic assessment in classrooms. In Estonia, for example, each diagnostic assessment tool is accompanied by a series of e-tasks that enable teachers to easily individualise teaching and learning and group students for different activities based on their performance in the tests (Innove, n.d.[48]). In Chile, diagnostic assessments introduced for the return to in-person learning following COVID-19 school closures were accompanied by a video mentoring programme for school management teams through which experts from the national administration worked with school staff to identify their main needs related to the assessments, explain and explore specific tools and guidance that could address these needs and then analyse and evaluate their experiences (OECD, 2020[19]). In Bulgaria, REDs could perform a similar role, working with groups of teachers to plan, implement and analyse diagnostic assessments in their classrooms. In addition, participants in the pilot programme for primary level mathematics teachers (as recommended above) can help build a bank of useful resources and key guidelines for future participants based on their own experiences. The guidelines and supports can then be collated on the digital platform for assessment (see Recommendation 2.1.1).
Provide teachers with the time and space to engage with results from diagnostic assessments. In a context where the curriculum is already perceived to be overloaded, it is important that Bulgarian teachers feel they have the time and space to adapt their teaching in response to their students’ needs, as determined through the start-of-year diagnostic tests. This chapter has already recommended reducing the required frequency of formal classroom assessment (see Recommendation 2.1.2). However, to further support the use of diagnostic assessments, Bulgaria could also consider introducing an assessment-free period in the first half of the first semester to allow teachers and students to engage with, and respond to, the results of the diagnostic assessments before having to prepare for the next assessment.
Recommendation 2.2.2. Foster real change at the classroom-level by making training on formative assessment a priority for all teachers
School change scholars suggest that unless teachers and school leaders understand and share the meaning of a policy, it is unlikely to be implemented (Viennet and Pont, 2017[39]). Embedding formative assessment in classroom practice requires changing schools’ and teachers’ practices and beliefs, as well as the pedagogical materials they design and use. Therefore, as in other countries in the region, encouraging greater use of formative assessment in Bulgaria will require concerted efforts, not only to develop teachers’ assessment literacy but also to build an understanding of why it matters (Kitchen et al., 2017[2]; Maghnouj et al., 2020[25]). Just as teachers require additional support related to curriculum content and subject knowledge following curricular reform, they also require guidance and training related to specific pedagogical components, including assessment practice. Although training opportunities have been made available to support curriculum implementation in Bulgaria, there is no evidence that these have explicitly covered new approaches to assessment. This gap must be addressed to align the intended, implemented, assessed and learned curricula in Bulgaria.
Strengthen the development of formative assessment practices in initial teacher education (ITE)
Research indicates that if teachers do not learn to meaningfully apply formative assessment practices during their initial education, this will limit their ability to apply formative assessment throughout their careers (Earl, 2007[49]). Meaningful application requires a strong understanding of the concepts and theories behind formative assessment but also practical experience of using formative assessment in the classroom. In Bulgaria, assessment practices are currently treated as a transversal component of ITE and it is rare, if not unheard of, for teacher candidates to engage in programme modules explicitly dedicated to formative assessment. In some respects, it is positive that assessment practices are not divorced from subject knowledge and pedagogy. However, new teachers in Bulgaria could benefit from more explicit instruction in formative assessment, especially since they are graduates of a school system that did not promote such approaches. Without addressing this issue in ITE, Bulgaria may replicate existing assessment practices rather than implement new ones. To ensure that formative assessment is a prominent feature of ITE, Bulgaria could:
Identify key components of formative assessment practices to be included in ITE programmes. The Ministry should prioritise the formative practices that it expects all teachers across Bulgaria to master. This could include elements identified in academic literature as particularly effective (e.g. questioning, feedback, and self and peer assessment) (Black and Wiliam, 2018[16]) as well as elements more specific to Bulgaria’s assessment framework (e.g. diagnostic testing). These components should then form the basis of curricular guidelines or requirements relating to formative assessment for ITE providers.
Include formative assessment in practical components of ITE. In many education systems, and certainly in Bulgaria, ITE programmes tend to rely on a knowledge-based and didactic approach to preparing teachers rather than an applied, competency-based approach. To develop formative assessment practices more effectively, programme providers and trainers will need to work closely with school leaders and mentors to align their understanding of the key components of formative assessment and identify expectations for teacher candidates’ application of these components in the classroom. In particular, any school or university-based mentors working with trainee teachers during their practicum will need their own training and guidance on supporting the development of formative practices.
Establish incentives and accountability processes to motivate ITE providers and beginning teachers to embed formative assessment practices. Having identified the key components of formative assessment to be included in ITE programmes and appropriate ways in which these can be applied to the teaching practicum, the Ministry should embed these within programme accreditation standards for providers. This would help incentivise providers to adhere to the new guidance more closely. At the same time, the components should also be mirrored within the professional profile for beginning teachers (see Chapter 3); in this way trainee and novice teachers, as well as their mentors and tutors, will be more motivated to develop associated skills.
Ensure that teachers have access to quality continuous professional development on formative assessment
To reach the wider cohort of teachers, beyond those in the earliest stages of their career, promoting quality professional development on formative assessment will be crucial. This implies instructing teacher education providers to develop programmes that strengthen formative assessment practices and encouraging teachers to participate in this training. As many teachers, particularly older ones, may hold more traditional views of assessment and be more reluctant to take up new approaches, the new training opportunities will need to be of very high quality and have a wide reach. Such training should be based on evidence of what makes professional development impactful, such as active learning, school-embedded training and a sustained duration (OECD, 2020[19]). In Sweden, for example, pedagogical training within schools has had a positive impact by creating a space for teachers to independently plan a teaching sequence using formative assessment tools, discuss the plan with colleagues, then teach the lessons and evaluate their experience (OECD, 2019[42]). In Bulgaria, this type of initiative could be meaningfully facilitated through the REDs, which now have a role in providing methodological support to teachers. For this to occur, however, staff within the REDs will need their own training and support mechanisms (see below). Designing the programme in collaboration with higher education institutions could also be a way of enhancing collaborative relationships between teachers, schools and higher education staff.
Changing practices across all teachers will take time and progress will be uneven, with some teachers more open to change than others. These asymmetries can be used in a positive way. Teachers who engage with formative assessment practices more proactively from the start can play a role in supporting other teachers in their schools to implement change, for example through mentoring or coaching schemes, by running in-school teacher-led training or simply by sharing good practice. In this way, formal professional development opportunities can be complemented by school-based peer support. However, for this to be a sustainable approach to stimulating wider change, these teachers would need formal recognition of their role and supportive school-level structures. For example, the role could be recognised within the appraisal process or through professional development credits (see Chapter 3).
Recommendation 2.2.3. Equip teachers with a range of practical support to facilitate formative assessment in the classroom
While training and learning opportunities are important in redressing teachers’ misconceptions of formative assessment and establishing a baseline of related knowledge and skills, supporting teachers in integrating formative assessment in their classroom practice will require ongoing support and resources that are grounded in or are easily transferable to their own practice. Specifically, teachers need access to practical tools that can be adapted to meet the needs of their students and methodological support at the local level. Bulgaria can support teachers in this way by developing online support, such as videos, rubrics and templates, and enhancing the in-person support offered by the REDs.
Provide teachers with resources to support formative assessment practices
Over the last 20 years, a wealth of research has been undertaken on formative assessment and related practices have been applied and tested. Bulgaria can benefit from this body of knowledge and resources by making it available to teachers in easily accessible and digestible formats. The online assessment platform (see Recommendation 2.1.1) would make an ideal home for these tools. A similar resource has been developed by the Australian Institute for Teaching and School Leadership (AITSL), which collates information about assessment and feedback, as well as other classroom practices, in a digital library for teachers. The library includes documents in a range of formats, from evidence summaries to videos of classroom practice and assessment tools and guides (AITSL, n.d.[50]). Bulgaria should consider gradually building a library of guidance materials, as well as videos modelling feedback processes in the classroom, student report card templates, marked examples of students’ work, rubrics for assessing student learning against expected learning outcomes and video tutorials on key aspects of formative assessment.
Build capacity at the regional level to support teachers’ formative assessment
Analysis of policy processes related to teacher development has revealed that initiatives designed to address needs at the local level can be particularly impactful (OECD, 2020[19]). In Bulgaria, REDs should take responsibility for supporting teachers with formative assessment in the classroom. By assigning them this role, the Ministry would be better positioned to identify clear outcomes against which to monitor the performance of RED staff. For instance, similar to the Norwegian model (Box 2.3), the Bulgarian Ministry could set some basic guidelines for REDs to support teachers with formative assessment while still allowing the REDs to develop their own programmes. The Ministry could monitor progress by collecting feedback from teachers and school leaders about the kinds of support they have received, as well as their understanding and application of formative assessment practices.
Box 2.3. Supporting schools to strengthen assessment through regional initiatives in Norway
Norway’s Assessment for Learning programme (2010-18) was developed to support schools, training providers and local authorities to improve formative assessment practices in classrooms across the country. The Directorate for Education and Training set guiding principles for the content and organisation of the programme, while local authorities were charged with local-level implementation. Around 90% of municipalities were involved in the programme across two phases.
The programme was based on four principles for quality formative assessment, outlined in Norway’s Education Act. These are that students learn better when they: i) understand what to learn and what is expected of them; ii) obtain feedback that provides information on the quality of their work or performance; iii) are given advice on how to improve; and iv) are involved in driving their own learning process and self-assessment. To help implement the programme, the directorate provided a range of core documents to municipalities. These included: a base document describing the aims of the programme, common guidelines, and roles and responsibilities for all participants; planning, self-evaluation and reporting templates for schools; and a pupil survey producing results at the national, school owner and school levels. The directorate also organised seminars and conferences for participating local authorities and provided online training and resources for schools.
Final evaluations found that participation had led to a more learning-driven assessment culture, increased use of formative assessment practices, improved curriculum planning and improved research and development culture among schools. The learning networks among participating schools aided the exchange of knowledge and provided peer support for implementation. However, the scope of change varied, indicating that some participants needed more time to bring about significant change.
Source: OECD (2020[51]), Education Policy Outlook: Norway, https://www.oecd.org/education/policy-outlook/country-profile-Norway-2020.pdf (accessed on 18 August 2021); Hopfenbeck, T. et al. (2013[52]), "Balancing Trust and Accountability? The Assessment for Learning Programme in Norway: A Governing Complex Education Systems Case Study", https://doi.org/10.1787/5k3txnpqlsnn-en.
Policy issue 2.3. Enhancing the validity and fairness of examination and selection processes into and out of upper secondary education
Bulgaria provides multiple pathways as students move into upper secondary education, all of which enable access to the tertiary level. In principle, this encourages students to think about their futures as they progress through school, selecting study programmes that are well-suited to their ambitions and providing opportunities to change pathways if those ambitions change. In reality, students in Bulgaria rarely change pathways and the Grade 7 NEA, which was initially implemented as a system monitoring tool, plays an outsized role in determining students’ educational destinies. Moreover, student selection occurs markedly earlier in Bulgaria (around age 13) than in most countries across Europe and the OECD (around age 16), exacerbating challenges to system quality and equity. Using an examination to help sort students into different schools can improve fairness by ensuring that tracking decisions are not determined solely by teacher judgements and by reducing the scope for manipulation. However, there are elements of Bulgaria’s Grade 7 NEA, such as the lack of safeguards to mitigate the adverse effects of high-stakes testing and a negative backwash on the curriculum, that distort both learning and the selection process. To improve student transitions into upper secondary education, Bulgaria should design an examination and student allocation process that is better matched to the purpose of selection.
Compared to the NEA, Bulgaria’s State Matriculation examination, which students take at the end of upper secondary education, is perceived as a more valuable tool for facilitating student transitions. The integrity and reputation of this examination have increased in recent years thanks to its secure development, administration and marking procedures. As a result, the vast majority of eligible students now opt to sit the State Matriculation examination and a growing number of higher education institutions accept its results as part of their admissions criteria. The State Matriculation examination has also facilitated a certification process for Bulgaria’s upper secondary education system by standardising, at least to some extent, the transition into tertiary education. Safeguards, such as subject choice and opportunities to resit, are also in place to alleviate potential negative effects on student outcomes. However, there is scope to align the State Matriculation examination more closely with the subject areas covered in Bulgaria’s national curriculum and with broader development goals, given, for example, that very few students choose to take the examination in high-demand STEM subjects.
Recommendation 2.3.1. Reform the selection process into upper secondary education to increase equity and facilitate quality learning in Grade 7
While in the longer term Bulgaria may want to rethink the value of having a selection examination at age 13 within the context of a broader reflection on the structure of school cycles and programmes, in the immediate term there is a need for reliable, external input at the transition point between lower and upper secondary education, in particular for those students applying to the most in-demand schools. This will imply rethinking the Grade 7 examination to improve its validity from the perspective of selection, notably by making a much clearer distinction between the NEA and a Grade 7 to 8 selection examination and by revising the content of the tests to better reflect the curriculum. It also means reviewing how the test outcomes are used in the selection process, including whether the test is mandatory or optional, enhancing the reliability of the data by removing classroom marks, and providing additional information and support to students to inform their choice of programme and school.
Introduce a new standardised examination specifically for the selection process
Academic research on assessment design warns of the risks associated with using a single test for multiple purposes, particularly in situations where the information required from the assessment for each purpose is not the same (Morris, 2011[53]). In instances where multiple purposes are present, the main purpose of the assessment should be clearly stated and mutually recognised by all stakeholders (OECD, 2013[14]). This is not the case with the Grade 7 NEA which, for system actors, has the primary purpose of providing information about system performance, but for teachers, parents and students is a vehicle for school and pathway selection for upper secondary education. To decouple the system monitoring aims from the selection process, Bulgaria should take the following steps:
Remove the selective function from the NEA. By removing performance in the NEA in Grade 7 from the selection process for students’ transitions into upper secondary school, Bulgaria can focus on developing and strengthening a low-stakes, more formative national assessment as a tool for evaluating and improving system performance. Chapter 5 provides more detailed recommendations for this process.
Introduce a new selection examination that is more fit for purpose. Once the national assessment and the selection examination have been decoupled, Bulgaria can focus on developing a new examination specifically designed to inform admission to upper secondary school. In particular, Bulgaria should clarify the purpose of the selection process in general: is it a way to identify the highest-performing students in academic terms, for allocation to the highest-performing schools, or is it a means by which students can be matched to the pathways and schools most suited to their abilities and ambitions? Estimates indicate that only the top 5% of academic performers can be fairly identified through a selective academic examination alone (Vernon, 2017[54]). Therefore, in the shorter term at least, while the selection process remains compulsory for all students, its focus should be the latter – matching students to the most appropriate pathways and schools for them.
Clearly communicate this change to the wider public. Developing a new examination specifically designed for the selection process is an opportunity to reduce the perceived stakes of the examination as, in the longer run, the participation of all students will cease to be a requirement. However, to realise this opportunity, the Ministry must clearly communicate to all stakeholders that the new examination and selection process is not a reference point for judging system or school quality. This communication effort will need to go hand in hand with similar efforts regarding the reformed national assessment for Grade 7. Ultimately, the most powerful signal will be to move that national assessment to Grade 6, as suggested in Chapter 5. Prior to that, virtual conferences with educators and parents, as well as wider communications efforts through national media will be important.
Include relevant stakeholders in initial development discussions. This includes representatives from different types of upper secondary schools, covering the full range of pathways available, as well as representatives from the lower secondary schools. In addition, actors from the REDs and the Center for Assessment should be involved. Actively involving these actors in the development and design of the new selection process, at least initially, could help build consensus around the changes and trust in the new examination system.
Maintain current responsibilities for development and administration. Although the purpose and design of the new examination will be reformed, the structures currently in place for managing the examination and selection should be maintained. This means that the Center for Assessment should lead the development and administration of the new examination and REDs should continue to oversee the online selection process. Ensuring continuity in management and development could help ease the transition to a new examination and selection process. These bodies have already developed the required capacity and well-trusted processes; leaving them in their roles could help facilitate buy-in for reforms.
Design a selection examination that assesses a broader set of competencies to better inform selection into different pathways
As the aim of the Grade 7 selection process should be to best match students to upper secondary pathways and institutions, the selection examination must assess broader competencies. Currently, the Grade 7 NEA only considers student performance in Bulgarian language and literature, and mathematics. A new examination that assesses a wider set of competencies could offer more tailored information on the specific profiled pathway a student might be best suited to, or on the appropriateness of VET or general education. There are different possible approaches to achieving this: designing separate examinations for different pathways or programme types, with students entered for the examinations corresponding to their chosen pathways, or developing a single examination that generates information on a broader set of skills, which are then matched against the demands of the various pathways in secondary education. The Netherlands uses a well-respected examination of the second type to inform student transitions to secondary education (Box 2.4).
Box 2.4. Designing tests to inform student transitions to secondary school in the Netherlands
In the Netherlands, all students take an extended learning achievement test at the end of primary education, which helps provide information on the most suitable type of secondary school for them. The majority of students (85%) take a test designed by the Central Institute for Test Development (CITO). The CITO test has a multiple-choice format covering several subjects (Dutch language, mathematics and study skills) as well as an optional subject, world orientation (geography, history, biology). The results provide information on students’ mastery of key skills across these subject areas; as such, the final score not only reflects the student’s learning achievement but can also indirectly indicate aspects of intelligence, motivation, concentration and drive to learn. Students’ scores are sorted into three score bands. Based on extensive research conducted by CITO, each score band translates into an individualised report advising on the student’s suitability for each of the available pathways. Notably, none of the bands indicates the immediate rejection of a student from a specific pathway; rather, the advice is to explore the options further at the school level, with the student and parents, as well as their teachers. In 2014, the OECD reported that these tests are recognised as having excellent psychometric properties, are highly reliable and are well respected within the Dutch education system. CITO’s research indicates that secondary schools most commonly use students’ scores as a second opinion to complement the primary school’s recommendation and as a good predictor of student suitability, rather than as a formal prescription of the type of school a child must enter.
Source: Nusche, D. et al. (2014[15]), OECD Reviews of Evaluation and Assessment in Education: Netherlands 2014, https://doi.org/10.1787/9789264211940-en; van der Lubbe, M. (n.d.[55]) The End of Primary School Test, Central Institute for Test Development, Amsterdam.
In Bulgaria, a continued focus on strengthening students’ core skills is required in order to increase the share of students reaching at least minimum proficiency in literacy and numeracy in particular. At the same time, the sheer number of available pathways and school types makes the development of separate examinations more challenging. Therefore, one way forward could be to combine an assessment of core skills with an evaluation of broader competencies. In this way, the new selection examination could be designed in two parts:
Part one could assess students’ current ability and readiness for upper secondary education by determining a basic level of core skills in key subjects. Initially, this would cover Bulgarian language and literature, and mathematics; however, given the stated focus on promoting STEM skills in the period 2021-30, Bulgaria may also consider introducing examination items on scientific knowledge. This part of the examination could include multiple-choice or simple constructed-response items suitable for objective scoring.
Part two could assess students’ aptitude for different pathways through more complex items designed to assess subject-specific higher-order skills and a range of key transversal competencies. Similar to the Dutch example, and in line with the move towards competency-based education, the focus of this part should be on key skills in these areas as opposed to testing subject-based knowledge. In this way, the examination can provide information about a student’s aptitude for a certain type of skill. This will require items that focus on the application of learned knowledge in real-world settings. These skills should be mapped against the skill demands of the prospective secondary pathways so that more granular information about a student’s performance in specific skill areas can inform the pathway they are suited to. By broadening the scope of assessed skills, Bulgaria can also help to reduce the negative effect the current examination has in narrowing the focus of the curriculum.
The relative weighting of marks across the two parts should be decided in collaboration with relevant representatives from a range of upper secondary schools so that decisions reflect schools’ priorities and encourage stakeholder support for the changes. However, the larger share of marks should be awarded to part two to ensure that higher-order skills are given due attention. While the examination scores will be reported on a single, national scale – an important requirement for transparency and comparability – the minimum number of marks required for entry into each school could still be determined at the regional level, similar to the current system.
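To make the mechanics of this proposal concrete, the sketch below shows one simple way raw scores from the two parts could be combined on a single national scale, with part two weighted more heavily, and then compared against a regionally set minimum. It is purely illustrative: the 40/60 weighting, the 100-point scale, the regional minimum and the function names are hypothetical choices made for this example, not parameters proposed in this review.

```python
# Illustrative sketch only: combines the two examination parts into a single
# national-scale admission score. The 40/60 weighting, the 100-point scale and
# the regional minimum are hypothetical values chosen for this example.

def admission_score(part1_raw: float, part1_max: float,
                    part2_raw: float, part2_max: float,
                    w_part1: float = 0.40, w_part2: float = 0.60) -> float:
    """Return a score on a 0-100 national scale, weighting part two more heavily."""
    assert abs(w_part1 + w_part2 - 1.0) < 1e-9, "weights must sum to 1"
    part1_share = part1_raw / part1_max
    part2_share = part2_raw / part2_max
    return 100 * (w_part1 * part1_share + w_part2 * part2_share)


# Example: 38/50 on the core-skills part and 52/70 on the competency part.
score = admission_score(38, 50, 52, 70)
regional_minimum = 65.0  # entry threshold set at regional level (hypothetical)
print(f"Admission score: {score:.1f} (meets regional minimum: {score >= regional_minimum})")
```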
In the short term, enable students to opt in to the selection process
As well as enhancing the Grade 7 selection examination to make it more fit for purpose, tackling the challenges of Bulgaria’s early tracking system will likely require larger-scale reforms to improve the transition from Grade 7 to Grade 8. However, given the intense public attention on this moment in children’s educational journeys, reforms in this area pose considerable implementation challenges and are likely to face resistance. A phased approach may therefore be required, with smaller, more urgent changes to enhance the fairness of the current process in the short term and larger-scale changes that gradually reduce the role and impact of selection on children’s futures in the longer term.
Prior to 2010, the selection process into upper secondary schooling was optional and only applied to students wanting to attend certain schools. These schools were typically the highest-performing gymnasia, or specialist schools for foreign languages or mathematics. Around 40% of students opted into the selection process, while the remainder enrolled in local schools or VET programmes. Once the selection process was coupled with the Grade 7 NEA, a census-based approach was adopted, meaning that all students must participate in the selection process, even if they do not apply for competitive places. Not only does this increase the administrative burden of the selection process, but it also exposes many students to undue academic pressure and worsens the negative backwash effect on the curriculum.
Bulgaria should make the new selective examination optional. Turkey, which until recently also had a compulsory selection process in place, provides a good international example of moving to an optional process (Box 2.5). At first, the majority of students will likely continue to opt in to the selection process. However, some students will be happy to be automatically allocated to their local upper secondary school, which could immediately help to address some of the inequities caused by academic segregation and reduce some of the risk of school dropout. To facilitate this process, REDs will need to work with schools in their region, drawing on historical oversubscription trends, to determine which schools can offer places to students not participating in the selection process, and how many. Schools for which admission has historically been competitive would not accept students who do not participate in the new selection process. At each stage of the online ranking process, students should be allowed to withdraw from the competitive process and opt to be automatically allocated to their local upper secondary school.
Box 2.5. Reducing the role of selective admissions for upper secondary education in Turkey
Turkey has recently reformed upper secondary school placement procedures to help address the inequities created by early tracking. The previous mandatory Transition from Elementary Schools to Secondary Schools Examination (Temel Eğitimden Ortaöğretime Geçiş Sistemi, TEOG) required students to rank their school preferences. They were then allocated one of their preferred schools based on results in a centralised examination (70% weighting) and average scores in lower secondary classroom assessments (30% weighting). While the TEOG was considered a fair and transparent examination, it also created a high level of competition and excessive pressure on learners, as well as narrowing the curriculum and promoting out-of-school private tutoring.
In response to these criticisms, the government abolished the TEOG in 2017 and announced a new system based on catchment areas, students’ interests and overall achievement in lower secondary. Around 10% of school places in the top schools are still determined by an optional centralised examination. In 2018/19, about 85% of the cohort chose to take this examination, which determined around 13% of places. However, Turkey expects candidate numbers to fall as families and schools become familiar with the new system. While the reform’s intentions are positive, Turkey needs to carefully manage oversubscription to those schools considered better quality and mitigate continued inequities as advantaged students tend to have better access to information, private tutoring and quality schools in their area. Early analysis indicates an immediate reduction in the effect of school types and students’ socio-economic status on mean national assessment scores following the changes; continued monitoring over the long term is required to validate this.
Source: OECD (2020[56]), Education Policy Outlook: Turkey, https://www.oecd.org/education/policy-outlook/country-profile-Turkey-2020.pdf (accessed on 18 August 2021); Kitchen, H. et al. (2019[27]), OECD Reviews of Evaluation and Assessment in Education: Student Assessment in Turkey, https://doi.org/10.1787/5edc0abe-en.
Reduce the influence of teacher-assigned marks in the selection process
At present, school allocation is overseen by REDs and takes into account students’ results on the Grade 7 NEA as well as teacher-assigned marks for classwork. Although this approach could help to reduce the high-stakes nature of the NEA, in the Bulgarian context it creates negative consequences that outweigh the potential benefits. First, as individual schools have full autonomy to determine the weight of teacher-assigned marks when considering applicants, including this criterion reduces transparency and increases the opportunity for schools and well-informed students and parents to manipulate the system. Second, there is evidence to suggest that teacher-assigned classroom assessments are neither very reliable (i.e. they are not consistent between classrooms across the country) nor valid (i.e. they do not assess the full spectrum of competencies covered within the curriculum, even for the core subjects). Finally, by including teacher-assigned marks, the current system encourages a negative backwash effect on the curriculum because teachers find themselves under pressure from students, parents and school leaders to give high marks in Bulgarian language and literature and mathematics throughout the academic year so their students can access the school of their choice. Once a new, more robust selection examination has been introduced, Bulgaria should considerably reduce the weight of teacher-assigned marks in the selection process. For example, the weight of teacher marks could be limited to a maximum of 20% of the admission score. This measure could be reviewed in the longer term, once classroom assessment has become more reliable and valid through reinforced training and support for teachers.
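As a purely illustrative sketch of the proposed ceiling, the example below blends an examination score with teacher-assigned marks on Bulgaria’s 2-6 marking scale, capping the teacher component at 20% of the admission score. The rescaling of classroom marks, the 0-100 admission scale and all values are assumptions made for the example, not a prescribed formula.

```python
# A minimal sketch, assuming a 0-100 admission scale: the examination contributes
# at least 80% of the final score and teacher-assigned marks at most 20%.
# The rescaling of Bulgaria's 2-6 marks and all example values are assumptions.

def final_admission_score(exam_score: float, teacher_mark_avg: float,
                          teacher_weight: float = 0.20) -> float:
    """Blend an exam score (0-100) with average teacher-assigned marks (2-6 scale)."""
    teacher_weight = min(teacher_weight, 0.20)      # enforce the proposed 20% ceiling
    teacher_share = (teacher_mark_avg - 2) / 4      # rescale marks from the 2-6 range to 0-1
    return (1 - teacher_weight) * exam_score + teacher_weight * 100 * teacher_share


# Example: exam score of 72/100 and an average classroom mark of 5.5.
print(f"{final_admission_score(exam_score=72.0, teacher_mark_avg=5.5):.1f}")  # 75.1
```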
Support students to make more informed choices suited to their aptitudes and ambitions
The current selection process, which allows for an unlimited number of initial school choices, does not encourage students – with support from parents and schools – to make careful decisions about their school pathways. As a result, students are more likely to take a default approach, ranking schools based on their perceived quality and reputation as opposed to their own interests, ambitions and academic ability. To counter this, Bulgaria should strengthen the guidance and information available to students at the end of lower secondary education. If students are expected to apply to schools and programmes that are more suited to their interests and ambitions, they need support to understand what different schools offer. Bulgaria should therefore require REDs to develop and maintain comprehensive information portals with guidance on the selection process, profiles of the different schools and programmes available in the region, and information on further education and employment pathways beyond upper secondary education. Schools providing lower secondary education should also be required to inform their students about the different pathways and school types available to them. To encourage schools to strengthen this aspect of their role, supporting students with decisions about their future educational and career pathways could be integrated into the revisions of Bulgaria’s school quality standards and indicators (see Chapter 4).
In addition, specific targeted supports for disadvantaged learners will be required in order to help counteract inevitable asymmetries in access to information and guidance. Possible interventions could include incentivising highly competitive upper secondary schools to reach out to primary or lower secondary schools with higher shares of disadvantaged students to enhance the information those students receive about their options and support them in their application process. REDs could also be required to implement information outreach programmes with disadvantaged or vulnerable communities to ensure they have access to quality information about the options available to them and the processes to follow. Requiring REDs to include information about these efforts in their reporting to the Ministry could help engage them. Finally, schools providing lower secondary education should be encouraged to offer more individualised advice, particularly to disadvantaged students who may not have equal recourse to such advice from familial or social networks as their advantaged peers.
In the longer term, require schools to apply to become selective and delay selection
There is little evidence that education systems with academically selective schools achieve higher outcomes than non-selective systems, and they often promote segregation by academic ability, socio-economic status, or both. As such, these systems reduce the opportunity for the positive crossover effects that come from having more academically and socially diverse student cohorts, particularly for lower-performing or disadvantaged students. These positive effects include higher individual academic outcomes, lower dropout rates, more positive behaviours inside and outside the classroom and the development of broader social networks (Sacerdote, 2011[57]). Many of these negative effects can be seen in Bulgaria, where between-school segregation and differences in student outcomes are high compared to international peers. Although this challenge begins early, affecting primary and lower secondary schools too, its effects are heightened by the fact that student selection in Bulgaria occurs very early. Thus, the segregation that develops informally in earlier years of education becomes formalised once all students are sorted into upper secondary schools at age 13.
Over the longer term, Bulgaria should explore ways of reducing the level of academic selection and stratification in the system by limiting the pool of selective schools at the upper secondary level. This pool should be determined by quotas set according to school type and applied regionally. In this way, Bulgaria can avoid having a homogenous pool of high-performing, elite gymnasia in general education and instead have a more heterogeneous pool of selective schools that encompasses general and vocational schools, as well as gymnasia, secondary and integrated schools. Moreover, via the quota, the pool of competitive schools could be distributed more evenly across the country; currently, the majority of elite, highly selective schools are located in the capital Sofia.
Schools that want to be selective would be required to apply to REDs to be considered within the quota. The criteria by which REDs then select schools should be established nationally by the Ministry to reduce any potential manipulation of the system. These criteria could be based on school evaluation performance as a means of incentivising continuous school improvement. They could also take into account outreach responsibilities for these more elite schools, such as partnering with a low-performing, non-selective school to provide peer learning, staff coaching and leadership support. The criteria could also take into account the socio-economic diversity of the school, favouring more diverse schools and thus encouraging further outreach efforts. Applying this process annually would be burdensome for REDs and potentially very disruptive for students, parents and school staff. Therefore, the school application process should be repeated every five years (i.e. a complete cycle of upper secondary education).
Although a considerable reform, the introduction of quotas alone will not be enough to minimise school segregation entirely. Addressing this challenge fully will require further structural reforms that look deeper into the architecture of the Bulgarian education system. First, to help ensure all students in Bulgaria achieve a minimum level of competency in core curriculum areas and to keep a range of pathways available as students mature and their ambitions develop, Bulgaria must seriously consider delaying the student selection process until the end of compulsory education (i.e. Grade 10). In introducing academic selection, the transition to upper secondary education and specialisation at age 13, Bulgaria is among a minority of countries; across the OECD, the most common age of first tracking is 16. Although some other systems also track early, and some even earlier than Bulgaria, in many cases these approaches are being reformed (e.g. Germany, Turkey), occur in countries with generally more equal societies where tracking is less strongly tied to socio-economic status (e.g. Austria, the Czech Republic), or operate within systems in which there is less of a perceived hierarchy between different pathways and school types (e.g. the Netherlands, Switzerland).
Knowing that segregation within the Bulgarian system starts even earlier than the formal selection process at the transition to upper secondary, measures to address stratification in lower levels of schooling should also be considered. This could include removing selection processes at earlier ages for the small number of elite specialised schools or reforming school admission practices at the primary level.
Recommendation 2.3.2. Enhance the validity of the State Matriculation examination to ensure it better fulfils its dual purpose
Examinations, like all quality assessments, must demonstrate both high reliability and validity. Reliability refers to the extent to which the assessment is consistent in measuring what it sets out to measure; validity refers to how appropriate an assessment is in relation to its objectives (OECD, 2013[14]). At present, Bulgaria’s State Matriculation examination demonstrates a relatively high degree of reliability, which is critical given the stakes it carries for students. This is an important achievement in a context where trust in government processes tends to be low and the perceived risks of corruption high. However, its validity could be improved, both in terms of certifying achievement against national learning standards and signalling suitability for transition to higher education. This is particularly important given the backwash effect the examination has on what is taught, learned and assessed throughout upper secondary education. Building on its success in strengthening the reliability of the State Matriculation examination over the last decade, Bulgaria must now think more strategically about leveraging this examination to better support its curriculum and broader skills objectives. This can be achieved by expanding the breadth of core and transversal competencies assessed by the examination and ensuring it discriminates more consistently between levels of student performance.
Improve alignment with the competency-based curriculum and curricular priorities
Ensure examinations include items that ask students to apply their knowledge and skills in relevant, practical contexts. State Matriculation examinations for all subjects should include items that use authentic data and/or sources and are set in real-world contexts. For example, in Bulgarian language and literature, students could be asked to engage with a wide range of literary and non-literary texts covering different forms and media, rather than solely traditional texts from the literary canon. Tasks in mathematics could require students to use authentic data and practical contexts in addition to assessing more abstract knowledge of mathematical formulae. The Center for Assessment could provide item writers with clearer guidance on this, requiring, for example, at least 20 points (out of 100) to be assigned to items of this nature. This requirement could be a starting point, with the weighting of such tasks increasing over time as item writers, teachers and test takers become more familiar with them. Developing such items will need to be an ongoing effort, and their quality should be reviewed annually to improve items for the following year.
Introduce a compulsory examination in mathematics. Currently, only 5-10% of Bulgarian students choose to take a State Matriculation examination in mathematics each year, despite the fact that mathematics remains a compulsory subject in the upper secondary curriculum. This contrasts considerably with other countries in the region where mathematics is a compulsory examination subject. Like Bulgarian language and literature, which is also a compulsory part of the curriculum, mathematics should become a compulsory subject in the examination. This will both help to raise its profile during upper secondary education and incentivise better teaching and learning. At the same time, differentiation will be required between students who study mathematics within their selected profile and those who only study it within the compulsory curriculum. Bulgaria should therefore consider offering different types of mathematics examinations for students to choose from. In Norway, all students take an examination in mathematics, but students in social science studies take “Mathematics S” courses while students in natural science and mathematics studies take “Mathematics R”, which has a stronger focus on pure mathematics and a small amount of probability (Maghnouj et al., 2019[21]). In England, although mathematics is not a compulsory subject at the end of upper secondary education, students seeking a qualification in mathematics have multiple options to choose from. For example, students can opt to take a final examination in core mathematics (focused on practical skills to be applied in work, study or everyday life), mathematics (for those studying general mathematics at this level) or further mathematics (focused on advanced mathematical skills as a bridge to further study in tertiary education).
Establish a school-based component that provides an opportunity to assess broader skills. As more OECD countries have adopted competency-based curricula, there has been a growing interest in performance-based assessments, such as experiments or projects, which require students to mobilise a wider range of skills and knowledge, and demonstrate complex competencies like critical thinking and problem solving (OECD, 2013[14]). Integrating an assessment approach of this nature within a national examination can help balance central expertise and teacher ownership to facilitate maximum validity and reliability. Bulgaria should therefore consider creating an additional compulsory requirement for all students to complete a school-based project. These project-based assignments would be long‑term, in-depth projects that students complete within their school by applying skills they learned prior to the examination. These projects should be practical and aim to assess interdisciplinary competencies. In Bulgaria, this approach could build on the growing enthusiasm for project-based learning that is a key feature of innovative approaches to teaching and learning in the cohort of innovative schools (see Chapter 4). Students could be awarded a final mark for their work which is equivalent to a final examination mark. The project could be embedded within the compulsory hours already mapped out in the upper secondary curriculum for civic education, helping to raise the profile of this subject and ensure that students have adequate time and guidance to carry out a quality project.
Enhance the examination’s power of discrimination to ensure it is a useful indicator of student proficiency
To support students’ transitions into pathways beyond school, the results of upper secondary education examinations should help illustrate where a student’s strengths lie and accurately signal to future education providers or employers a student’s level of competency in the relevant subject. Currently, Bulgaria’s State Matriculation examination does not fully fulfil this role because high shares of students in several subjects are awarded a mark deemed “excellent”. At the same time, the distribution of marks varies considerably between subjects, meaning that an “excellent” in one subject may indicate a level of proficiency that is not matched by an “excellent” in another. This leads to speculation about the perceived difficulty of certain subjects, which may influence students’ choices about which subjects to take more than their own ambitions or aptitudes do. Bulgaria could benefit from taking the following steps to increase the State Matriculation examination’s ability to discriminate between different levels of student performance:
Remove the pre-determined pass/fail cut score. Currently, any student scoring less than 30% on the State Matriculation examination is deemed to have failed and must retake the examination. This approach is very transparent and easy to communicate to all stakeholders. However, it fails to take into account variations in the level of difficulty of the examinations from year to year. This could be misleading, as an increase in the pass rate may be caused by the inadvertent use of an easier test rather than an increase in the absolute level of achievement. In addition, this approach fails to link results in the examination to the expected levels of achievement expressed in the national curriculum. Bulgaria should move towards a criterion-referenced system for awarding a pass or fail. Specifically, test items and student responses should be analysed against expected levels of achievement so that, for a student to pass the examination, examiners must judge them to have achieved minimum proficiency against pre-established standards (OECD, 2013[14]); the first sketch following these steps illustrates how such a standard-setting procedure differs from a fixed percentage cut. Many OECD countries, such as Latvia, Lithuania and Slovenia, use this approach when marking national examinations at the end of upper secondary education. Although this approach can be more complicated to explain to the general public, it provides a more meaningful interpretation of student success and can also be a more useful tool for teaching, learning and assessment.
Investigate the disparities in achievement across subjects. Differences in the shares of top (“excellent”) and bottom (“fail”) marks awarded to students taking different subjects require further investigation to be better understood. For example, 60% of students taking the examination in chemistry and the environment receive a mark equivalent to “excellent” and 1% “fail”, compared with those taking the examination in geography and economics, of whom 4% receive an “excellent” and 16% “fail”. There are many possible reasons for this. It could be that the chemistry examination is easier or that the standard of teaching across schools is higher. The OECD review team heard that it may be due to the profile of students choosing these subjects: geography is a popular choice among students in VET schools, whose overall achievement in upper secondary education may be lower than that of students in general schools. It may also be linked to the fact that a much higher number of students take the geography examination compared to chemistry (4 430 students compared to 184 students in 2020). The reasons for the difference between results in these two subjects may not be the same as the explanation for differences between other subjects, and each of these possible causes requires attention. To identify and address the root cause of these imbalances, Bulgaria should undertake a comprehensive investigation of the issue, starting with a descriptive analysis of marks by subject (see the second sketch following these steps). This could be overseen by the Center for Assessment but should include a review panel composed of independent experts who were not previously involved in examination design and marking processes.
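The first sketch below illustrates, under simplified and hypothetical assumptions, how a criterion-referenced cut score could be derived through an Angoff-style standard-setting exercise and how it differs from a fixed 30% rule. The judge estimates and the ten-item paper are invented for the example; the review does not prescribe any particular standard-setting method.

```python
# Hypothetical illustration of a criterion-referenced (Angoff-style) cut score,
# contrasted with the current fixed 30% rule. Judges estimate, for each item, the
# probability that a minimally proficient student answers it correctly; the averaged
# sum of those estimates becomes the pass mark for that year's paper.
# All item data below are invented for this example.

judge_estimates = [  # one list of item-level estimates per judge, for a 10-item paper
    [0.9, 0.8, 0.7, 0.6, 0.5, 0.5, 0.4, 0.4, 0.3, 0.2],
    [0.8, 0.8, 0.6, 0.6, 0.6, 0.5, 0.5, 0.3, 0.3, 0.3],
    [0.9, 0.7, 0.7, 0.5, 0.5, 0.4, 0.4, 0.4, 0.2, 0.2],
]

per_judge_cuts = [sum(estimates) for estimates in judge_estimates]  # expected score of a borderline student
criterion_cut = sum(per_judge_cuts) / len(per_judge_cuts)           # averaged across judges

max_marks = 10
fixed_cut = 0.30 * max_marks  # the current fixed-percentage rule

print(f"Criterion-referenced cut: {criterion_cut:.1f} out of {max_marks}")
print(f"Fixed 30% cut:            {fixed_cut:.1f} out of {max_marks}")
# The criterion-referenced cut moves with the difficulty of each year's paper;
# the fixed cut does not.
```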
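As a starting point for the proposed investigation, the second sketch shows the kind of simple descriptive analysis that could flag disparities in the shares of “excellent” and “fail” marks by subject. The records are invented, and treating a mark of 6 as “excellent” and marks below 3 as “fail” is a simplifying assumption for illustration only; a real analysis would use the full results files held by the Center for Assessment.

```python
# Descriptive sketch: shares of "excellent" and "fail" marks by subject, computed
# from (subject, mark) records on the 2-6 scale. The records are invented and the
# mapping of 6 to "excellent" and marks below 3 to "fail" is a simplification.

from collections import defaultdict

records = [
    ("chemistry and the environment", 6), ("chemistry and the environment", 6),
    ("chemistry and the environment", 5), ("geography and economics", 3),
    ("geography and economics", 2),       ("geography and economics", 6),
]

tallies = defaultdict(lambda: {"total": 0, "excellent": 0, "fail": 0})
for subject, mark in records:
    tallies[subject]["total"] += 1
    tallies[subject]["excellent"] += mark == 6   # True adds 1, False adds 0
    tallies[subject]["fail"] += mark < 3

for subject, t in tallies.items():
    share_excellent = t["excellent"] / t["total"]
    share_fail = t["fail"] / t["total"]
    print(f"{subject}: {share_excellent:.0%} excellent, {share_fail:.0%} fail")
```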
Table 2.5. Table of recommendations
| Policy Issues | Recommendations | Action Points |
|---|---|---|
| Building a shared understanding of student assessment as a means to support teaching and learning | Establish a coherent national vision of student assessment | Formulate a high-level national vision of student assessment |
| | | Engage stakeholders in developing the new vision of student assessment |
| | | Clarify and better communicate expected learning outcomes to guide student assessment |
| | | Ensure alignment and coherence with wider evaluation and assessment practices |
| | | Communicate the vision in a strategic way to build trust and support for change |
| | Adapt the reporting of student learning information to promote a broader understanding of assessment | Make classroom and school-level marking practices more conducive to student learning |
| | | Strengthen reporting to help students and parents understand broader progress |
| Developing the capacity of teachers to use formative assessment | Promote the use of diagnostic assessments to help teachers better understand and adapt to the learning needs of students | Prioritise younger students and core subjects to have greater impact in the long term |
| | | Support teachers to make full use of start-of-year diagnostic assessments |
| | Foster real change at classroom level through making training on formative assessment a priority for all teachers | Strengthen the development of formative assessment practices in initial teacher education (ITE) |
| | | Ensure that teachers have access to quality continuous professional development on formative assessment |
| | Equip teachers with a range of practical supports to facilitate formative assessment in the classroom | Provide teachers with resources to support formative assessment practices |
| | | Build capacity at regional level to support teachers’ formative assessment |
| Enhancing the validity and fairness of examination and selection processes into and out of upper secondary education | Reform the selection process into upper secondary education to increase equity and facilitate quality learning in Grade 7 | Introduce a new standardised examination specifically for the selection process |
| | | Design a selection examination that assesses a broader set of competencies to better inform selection into different pathways |
| | | In the short term, enable students to opt in to the selection process |
| | | Reduce the influence of teacher-assigned marks in the selection process |
| | | Support students to make more informed choices suited to their aptitudes and ambitions |
| | | In the longer term, require schools to apply to become selective and delay selection |
| | Enhance the validity of the State Matriculation examination to ensure it better fulfils its dual purpose | Improve alignment with the competency-based curriculum and curricular priorities |
| | | Enhance the examination’s power of discrimination to ensure it is a useful indicator of student proficiency |
References
[50] AITSL (n.d.), Tools and Resources, Australian Institute for Teaching and School Leadership, https://www.aitsl.edu.au/tools-resources (accessed on 18 August 2021).
[33] Andrews, J., J. Hutchinson and R. Johnes (2016), Grammar Schools and Social Mobility, Education Policy institute, https://epi.org.uk/wp-content/uploads/2018/01/Grammar-schools-and-social-mobility_.pdf (accessed on 18 August 2021).
[16] Black, P. and D. Wiliam (2018), “Classroom assessment and pedagogy”, Assessment in Education: Principles, Policy & Practice, Vol. 25/6, https://doi.org/10.1080/0969594X.2018.1441807.
[47] Black, P. and D. Wiliam (1998), “Assessment and classroom learning”, Assessment in Education: Principles, Policy & Practice, Vol. 5/1, http://dx.doi.org/10.1080/0969595980050102.
[34] Daniell, G. (2018), “Dead ends and doorways: Attainment and equity in upper secondary school qualifications pathways”, University of Auckland, https://researchspace.auckland.ac.nz/bitstream/handle/2292/45083/whole.pdf?sequence=2&isAllowed=y (accessed on 18 August 2021).
[5] Darling-Hammond, L. and L. Wentworth (2010), Benchmarking Learning Systems: Student Performance Assessment in International Context, Stanford University.
[9] Dimitrova, Z. and S. Lazarov (2020), “Проучване - Ключови Компетентности и Умения за Успех – От Закон към Практика [Research - Key competences and skills for success - From law to practice]”, Education Bulgaria 2030, http://dx.doi.org/10.13140/RG.2.2.15222.42567 (accessed on 18 August 2021).
[49] Earl, L. (2007), “Assessment as learning”, in Hawley, W. and D. Rollie (eds.), The Keys to Effective Schools: Educational Reform as Continuous Improvement, SAGE Publications.
[13] EC (2010), “Assessment of key competences: Draft background paper for the Belgian Presidency meeting for Directors-General for school education”, European Commission.
[29] EC/EACEA/Eurydice (2020), Equity in School Education in Europe: Structures, Policies and Student Performance, Publications Office of the European Union, https://doi.org/10.2797/286306.
[23] EC/EACEA/Eurydice (2020), The Organisation of School Time in Europe: Primary and General Secondary Education 2020/21, Publications Office of the European Union.
[17] Elliott, V. et al. (2016), A Marked Improvement? A Review of the Evidence on Written Marking, Education Endowment Foundation, https://educationendowmentfoundation.org.uk/public/files/Presentations/Publications/EEF_Marking_Review_April_2016.pdf (accessed on 18 August 2021).
[6] Government of Bulgaria (2020), Стратегическа Рамка за Развитие на Образованието, Обучението и Ученето в Република България (2021-30) [Strategic Framework for the Development of Education, Training and Learning in the Republic of Bulgaria (2021-30)], Government of Bulgaria, https://epale.ec.europa.eu/sites/default/files/strategicheska_ramka_za_obrazovanieto_obuchenieto_i_ucheneto_v_republika_blgariya_2021_-_2030l.pdf (accessed on 18 August 2021).
[7] Government of Bulgaria (2020), National Recovery and Resilience Plan of the Republic of Bulgaria, Government of Bulgaria.
[38] Hipkins, R. and M. Cameron (2018), Trends in Assessment: An Overview of Themes in the Literature, New Zealand Council for Educational Research, https://www.nzcer.org.nz/system/files/Trends%20in%20assessment%20report.pdf (accessed on 18 August 2021).
[52] Hopfenbeck, T. et al. (2013), “Balancing Trust and Accountability? The Assessment for Learning Programme in Norway: A Governing Complex Education Systems Case Study”, OECD Education Working Papers, No. 97, OECD Publishing, Paris, https://dx.doi.org/10.1787/5k3txnpqlsnn-en.
[48] Innove (n.d.), Diagnostic Tests, https://www.innove.ee/oppevara-ja-metoodikad/digioppevara/e-kogud/diagnostilised-testid/ (accessed on 18 August 2021).
[27] Kitchen, H. et al. (2019), OECD Reviews of Evaluation and Assessment in Education: Student Assessment in Turkey, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/5edc0abe-en.
[2] Kitchen, H. et al. (2017), OECD Reviews of Evaluation and Assessment in Education: Romania 2017, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264274051-en.
[26] Kovacheva, D. (2020), Отмяна на изпитите от Национално външно оценяване за IV клас и X клас за учебната 2019/20 г. във връзка с пандемият от коронавирус [Cancellation of the National External Assessment Examinations for Grades IV and X for the 2019/20 School Year in Connection with the Coronavirus Pandemic], Ombudsman of Bulgaria, https://www.ombudsman.bg/pictures/%D0%9F%D1%80%D0%B5%D0%BF%D0%BE%D1%80%D1%8A%D0%BA%D0%B0%20%D0%9C%D0%9E%D0%9D.pdf (accessed on 18 August 2021).
[30] Levin, V., S. Guallar Artal and A. Safir (2016), “Skills for work in Bulgaria: The relationship between cognitive and socioemotional skills and labor market outcomes”, World Bank, Washington, DC, http://documents.worldbank.org/curated/en/508441467991943114/Skills-for-work-in-Bulgaria-the-relationship-between-cognitive-and-socioemotional-skills-and-labor-market-outcomes (accessed on 27 October 2021).
[1] Li, R. et al. (2019), OECD Reviews of Evaluation and Assessment in Education: Georgia, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/94dc370e-en.
[25] Maghnouj, S. et al. (2020), OECD Reviews of Evaluation and Assessment in Education: Albania, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/d267dc93-en.
[21] Maghnouj, S. et al. (2019), OECD Reviews of Evaluation and Assessment in Education: Serbia, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/225350d9-en.
[4] Ministry of Education and Science (2020), OECD Review of Evaluation and Assessment: Country Background Report for Bulgaria, Ministry of Education and Science of Bulgaria, Sofia.
[11] Ministry of Education and Science (2020), Учебна Програма по Математика за Дванадесети Клас - Общообразователна Подготовка [Mathematics Curriculum for Grade XII - General Education], Ministry of Education and Science of Bulgaria, Sofia, https://www.mon.bg/upload/16251/UP_XII_Math.pdf (accessed on 18 August 2021).
[8] Ministry of Education and Science (2019), Компетентности и Образование [Competences and Education], Ministry of Education and Science of Bulgaria, Sofia, https://www.mon.bg/upload/21560/I-book.pdf (accessed on 18 August 2021).
[22] Ministry of Education and Science (2017), Учебна Програма по Български Език и Литература за II Клас [Curriculum in Bulgarian Language and Literature for Grade II], Ministry of Education and Science of Bulgaria, Sofia, https://www.mon.bg/upload/13418/UP_2kl_BEL_ZP.pdf (accessed on 18 August 2021).
[12] Ministry of Education and Science (2016), Наредба No. 11 от 1 Септември 2016 г. за Оценяване на Резултатите от Обучението на Учениците [Ordinance No.11 of 01 September 2016 for the Evaluation of the Results of Student Learning], Ministry of Education and Science of Bulgaria, Sofia, https://www.lex.bg/en/laws/ldoc/2136905302 (accessed on 18 August 2021).
[10] Ministry of Education and Science (2016), Учебна Програма по История и Цивилизации за VI Клас - Общообразователна Подготовка [Curriculum in History and Civilizations for Grade VI - General Education], Ministry of Education and Science of Bulgaria, Sofia.
[53] Morris, A. (2011), “Student Standardised Testing: Current Practices in OECD Countries and a Literature Review”, OECD Education Working Papers, No. 65, OECD Publishing, Paris, https://dx.doi.org/10.1787/5kg3rp9qbnr6-en.
[20] Muskin, J. (2017), Continuous Assessment for Improved Teaching and Learning: A Critical Review to Inform Policy and Practice, Current and Critical Issues in Curriculum, Learning and Assessment, International Bureau of Education, UNESCO, https://unesdoc.unesco.org/ark:/48223/pf0000255511 (accessed on 18 August 2021).
[15] Nusche, D. et al. (2014), OECD Reviews of Evaluation and Assessment in Education: Netherlands 2014, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264211940-en.
[37] Nusche, D. et al. (2012), OECD Reviews of Evaluation and Assessment in Education: New Zealand 2011, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264116917-en.
[36] NZ Ministry of Education (2011), Ministry of Education Position Paper: Assessment [Schooling Sector], Ministry of Education of New Zealand.
[46] OECD (2020), Education Policy Outlook: Denmark, OECD, Paris, https://www.oecd.org/education/policy-outlook/country-profile-Denmark-2020.pdf (accessed on 18 August 2021).
[40] OECD (2020), Education Policy Outlook: Ireland, OECD, Paris, http://www.oecd.org/education/policy-outlook/country-profile-Ireland-2020.pdf (accessed on 27 October 2021).
[51] OECD (2020), Education Policy Outlook: Norway, OECD, Paris, https://www.oecd.org/education/policy-outlook/country-profile-Norway-2020.pdf (accessed on 18 August 2021).
[56] OECD (2020), Education Policy Outlook: Turkey, OECD, Paris, https://www.oecd.org/education/policy-outlook/country-profile-Turkey-2020.pdf (accessed on 18 August 2021).
[19] OECD (2020), Lessons for Education from COVID-19: A Policy Maker’s Handbook for More Resilient Systems, OECD Publishing, Paris, https://dx.doi.org/10.1787/0a530888-en.
[18] OECD (2020), PISA 2018 Results (Volume V): Effective Policies, Successful Schools, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/ca768d40-en.
[42] OECD (2019), Education Policy Outlook 2019: Working Together to Help Students Achieve their Potential, OECD Publishing, Paris, https://dx.doi.org/10.1787/2b8ad56e-en.
[3] OECD (2019), OECD Reviews of Evaluation and Assessment in Education: North Macedonia, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/079fe34c-en.
[32] OECD (2019), PISA 2018 Results (Volume II): Where All Students Can Succeed, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/b5fd1b8f-en.
[35] OECD (2017), Education at a Glance 2017: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2017-en.
[24] OECD (2015), Education at a Glance 2015: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2015-en.
[14] OECD (2013), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264190658-en.
[41] Ofsted (2012), The Impact of the “Assessing Pupils’ Progress” Initiative, Ofsted, https://www.bl.uk/britishlibrary/~/media/bl/global/social-welfare/pdfs/non-secure/i/m/p/impact-of-the-assessing-pupils-progress-initiative.pdf (accessed on 18 August 2021).
[43] Portuguese Ministry of Education (2021), Inicio [Home], https://afc.dge.mec.pt/ (accessed on 27 October 2021).
[57] Sacerdote, B. (2011), “Peer effects in education: How might they work, how big are they and how much do we know thus far?”, in Hanushek, E., S. Machin and L. Woessmann (eds.), Handbook of the Economics of Education, Elsevier B.V.
[45] Shewbridge, C. et al. (2011), OECD Reviews of Evaluation and Assessment in Education: Denmark 2011, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264116597-en.
[55] van der Lubbe, M. (n.d.), “The end of primary school test”, Central Institute for Test Development, https://docplayer.net/21810435-The-end-of-primary-school-test-marleen-van-der-lubbe-cito-the-netherlands.html (accessed on 27 October 2021).
[54] Vernon, P. (2017), Secondary School Selection: A British Psychological Society Inquiry, Routledge.
[39] Viennet, R. and B. Pont (2017), “Education policy implementation: A literature review and proposed framework”, OECD Education Working Papers, No. 162, OECD Publishing, Paris, https://dx.doi.org/10.1787/fc467a64-en.
[44] Wisniewski, B., K. Zierer and J. Hattie (2020), “The power of feedback revisited: A meta-analysis of educational feedback research”, Frontiers in Psychology, https://doi.org/10.3389/fpsyg.2019.03087.
[31] Woessmann, L. (2009), International Evidence on School Tracking: A Review, https://www.researchgate.net/publication/227353175_International_Evidence_on_School_Tracking_A_Review/citation/download (accessed on 27 October 2021).
[28] Zwier, D., S. Geven and H. van de Werfhorst (2021), “Social inequality in shadow education: The role of high-stakes testing”, International Journal of Comparative Sociology, Vol. 61/6, https://doi.org/10.1177%2F0020715220984500 (accessed on 18 August 2021).
Note
← 1. Bulgaria’s National Recovery and Resilience Plan is an investment and reform plan for 2021-26. It is part of Bulgaria’s involvement in the European Commission’s NextGenerationEU recovery instrument, through which the Commission supports member countries to repair the immediate economic and social damage resulting from the COVID-19 pandemic.