OECD Reviews of Evaluation and Assessment in Education: Albania

Chapter 5. Strengthening capacity to evaluate system performance

Abstract

Albania has started to establish some of the components integral to system evaluation, including the development of a modern education management information system (EMIS). However, the lack of processes and capacity needed to conduct system evaluation, as well as low government demand for evidence, limits Albania’s ability to use evaluation information for system improvement. This chapter recommends that Albania integrate evaluation more centrally into the new national education strategy and strengthen the institutional capacity needed to support a culture of system evaluation. This chapter also reviews Albania’s national assessment system and EMIS development plans, offering recommendations to ensure these tools support strategic planning and national education goals.

Introduction
System evaluation is central to improving educational performance. Evaluating an education system holds the government and other stakeholders accountable for meeting national goals and provides the information needed to develop effective policies. Albania has started to establish some of the components integral to system evaluation (Table 5.1). For example, the Ministry of Education, Sports and Youth (hereby, the ministry) has tasked its longstanding statistics sector and the newly established Agency for Quality Assurance in Pre-University Education (hereby, the Quality Assurance Agency), with monitoring education system performance. The Educational Services Centre (ESC) is also working to integrate various databases and develop a modern education management information system (EMIS), called Socrates, which by 2020 will store information related to students, teachers, curriculum and schools in pre-tertiary education. Nevertheless, progress towards improving system evaluation in Albania is uneven and the government demand for evidence that could help inform education policy is generally low. As a result, Albania lacks the impetus to address capacity constraints and further develop the tools needed for comprehensive system evaluation.
This chapter suggests several measures that Albania can take to develop stronger capacity for conducting system evaluation and better co‑ordinate the actors who contribute to this process. Since system evaluation relies on high-quality evaluation tools, data collection and management is one area of focus within this chapter. Without timely and trustworthy data, actors will not have a clear understanding of what is happening in the education system and where improvements should be made. Strengthening the national assessment system will be crucial to providing more reliable data that can help monitor educational progress and provide formative information for educational improvement. Taken together, these investments in system evaluation will support strategic planning and help Albania achieve national education goals.
Key features of effective system evaluation
System evaluation refers to the processes that countries use to monitor and evaluate the performance of their education systems (OECD, 2013[1]). A strong evaluation system serves two main functions: to hold the education system, and the actors within it, accountable for achieving their stated objectives; and, by generating and using evaluation information in the policy-making process, to improve policies and ultimately education outcomes (see Table 5.1). System evaluation has gained increasing importance in recent decades across the public sector, in part because of growing pressure on governments to demonstrate the results of public investment and improve efficiency and effectiveness (Schick, 2003[2]).
In the education sector, countries use information from a range of sources to monitor and evaluate quality and track progress towards national objectives (see Table 5.1). As well as collecting rich data, education systems also require “feedback loops” so that information is fed back into the policy-making process (OECD, 2017[3]). This ensures goals and policies are informed by evidence, helping to create an open and continuous cycle of organisational learning. At the same time, in order to provide public accountability, governments need to set clear responsibilities – to determine which actors should be accountable and for what – and make information available in timely and relevant forms for public debate and scrutiny. All of this constitutes a significant task, which is why effective system evaluation requires central government to work across wider networks (Burns and Köster, 2016[4]). In many OECD countries, independent government agencies such as national audit offices, evaluation agencies, the research community and sub‑national governments, play a key role in generating and exploiting available information.
A national vision and goals provide standards for system evaluation
Like other aspects of evaluation, system evaluation must be anchored in a national vision and/or goals, which provide the standards against which performance can be evaluated. In many countries, these are set out in an education strategy that spans several years. An important complement to a national vision and goals are targets and indicators. Indicators are the quantitative or qualitative variables that help to monitor progress (The World Bank, 2004[5]). Indicator frameworks combine inputs like government spending, outputs like teacher recruitment, and outcomes like student learning. While outcomes are notoriously difficult to measure, they are a feature of frameworks in most OECD countries because they measure the final results that a system is trying to achieve (OECD, 2009[6]). Goals also need to balance the outcomes a system wants to achieve with indicators for the internal processes and capacity throughout the system that are required to achieve those outcomes (Kaplan and Norton, 1992[7]).
Reporting against national goals supports accountability
Public reporting of progress against national goals enables the public to hold government accountable. However, the public frequently lacks the time and information to undertake this role, and tends to be driven by individual or constituency interests rather than broad national concerns (House of Commons, 2011[8]). This means that objective and expert bodies like national auditing bodies, parliamentary committees and the research community play a vital role in digesting government reporting and helping to hold the government to account.
An important vehicle for public reporting is an annual report on the education system (OECD, 2013[1]). In many OECD countries, such a report is now complemented by open data. If open data is to support accountability and transparency, it must be useful and accessible. Many OECD countries use simple infographics to present complex information in a format that the general public can understand. Open data should also be provided in a form that is re-usable, i.e. other users can download and use it in different ways, so that the wider evaluation community like researchers and non-governmental bodies can analyse data to generate new insights (OECD, 2018[9]).
National goals are a strong lever for governments to direct the education system
Governments can use national goals to give coherent direction to education reform across central government, sub-national governance bodies and individual schools. For this to happen, goals should be specific, measurable, feasible and above all, relevant to the education system. Having a clear sense of direction is particularly important in the education sector, given the scale, multiplicity of actors and the difficulty in retaining focus in the long‑term process of achieving change. In an education system that is well‑aligned, national goals are embedded centrally in key reference frameworks, encouraging all actors to work towards their achievement. For example, national goals that all students reach minimum achievement standards or that teaching and learning foster students’ creativity are reflected in standards for school evaluation and teacher appraisal. Through the evaluation and assessment framework, actors are held accountable for progress against these objectives.
Tools for system evaluation
Administrative data about students, teachers and schools are held in central information systems
In most OECD countries, data such as student demographic information, attendance and performance, teacher data and school characteristics are held in a comprehensive data system, commonly referred to as an EMIS. Data are collected according to national and international standardised definitions, enabling data to be collected once, used across the national education system and reported internationally. An effective EMIS also allows users to analyse data and helps disseminate information about education inputs, processes and outcomes (Abdul-Hamid, 2014[10]).
National and international assessments provide reliable data on learning outcomes
Over the past two decades, there has been a major expansion in the number of countries using standardised assessments. The vast majority of OECD countries (30), and an increasing number of non-member countries, have regular national assessments of student achievement for at least one level of the school system (OECD, 2015[11]). This reflects the global trend towards greater demand for outcomes data to monitor government effectiveness, as well as a greater appreciation of the economic importance of all students mastering essential skills.
The primary purpose of a national assessment is to provide reliable data on student learning outcomes that are comparative across different groups of students and over time (OECD, 2013[1]). Assessments can also serve other purposes such as providing information to teachers, schools and students to enhance learning and supporting school accountability frameworks. Unlike national examinations, they do not have an impact on students’ progression through grades. When accompanied by background questionnaires, assessments provide insights into the factors influencing learning at the national level and across specific groups. While the design of national assessments varies considerably across OECD countries, there is consensus that having regular, reliable national data on student learning is essential for both system accountability and improvement.
An increasing number of countries also participate in international assessments like the OECD Programme for International Student Assessment (PISA) and the two programmes of the International Association for the Evaluation of Educational Achievement, Trends in International Mathematics and Science Study (TIMSS) and Progress in International Reading Literacy Study (PIRLS). These assessments provide countries with periodic information to compare learning against international benchmarks as a complement to national data.
Thematic reports complement data to provide information about the quality of teaching and learning processes
Qualitative information helps to contextualise data and provide insights into what is happening in a country’s classrooms and schools. For example, school evaluations can provide information about the quality of student-teacher interactions and how a principal motivates and recognises staff. Effective evaluation systems use such findings to help understand national challenges – like differences in student outcomes across schools.
A growing number of OECD countries undertake policy evaluations
Despite increased interest across countries, policy evaluation is rarely systematic at present. Approaches vary, and include evaluation shortly after implementation and ex ante reviews of major policies to support future decision-making (OECD, 2018[12]). Countries are also making greater efforts to incorporate evidence to inform policy design, for example, by commissioning randomised control trials to determine the likely impact of a policy intervention.
Effective evaluation systems require institutional capacity within and outside government
System evaluation requires resources and skills within ministries of education to develop, collect and manage reliable, quality datasets and to exploit education information for evaluation and policy-making purposes. Capacity outside or at arm’s length from ministries is equally important, and many OECD countries have independent evaluation institutions that contribute to system evaluation. Such institutions might undertake external analysis of public data, or be commissioned by the government to produce annual reports on the education system and undertake policy evaluations or other studies. In order to ensure that such institutions have sufficient capacity, they may receive public funding, but their statutes and appointment procedures ensure their independence and the integrity of their work.
System evaluation in Albania
Albania has taken steps to establish some of the integral components needed to perform system evaluation (see Table 5.1). For example, Albania has several bodies with responsibilities for system evaluation, a national assessment of student learning in Grade 5 and a new EMIS currently in development that promises to modernise the way education data is collected and managed. While these are positive features that can certainly help support system evaluation, the current tools for this process remain nascent. Despite the fact that Socrates has been under development for many years, Albania’s lack of a functional EMIS is a striking gap compared with OECD countries and many other developing countries. Moreover, while Albania, unlike some of its Western Balkan neighbours, has managed to introduce a national assessment for system‑monitoring purposes, that assessment does not currently provide comparable results at the national level. In order to conduct comprehensive and co‑ordinated system evaluation, Albania needs to address capacity constraints and invest in high‑quality evaluation tools.
Table 5.1. System evaluation in Albania
| References for national vision and goals | Tools | Body responsible | Outputs |
|---|---|---|---|
| | Administrative data | Educational Services Centre (ESC); Monitoring, Priorities and Statistics Sector (in MoESY) | EMIS (in pilot phase); annual and ad-hoc education statistics releases; periodic reporting on government strategies |
| | National assessment | ESC | Assessment of Primary Education Pupils’ Achievement (VANAF, Grade 5): Albanian language; mathematics; science |
| | International assessments | ESC | |
| | School evaluations | *State Education Inspectorate | Annual report on the quality of the education process in schools (based on comprehensive and thematic school inspections) |
| | Policy evaluations | No established process | Some examples of outputs are: |
| | Reports and research | MoESY and *specialised agencies (e.g. ESC, State Education Inspectorate, Educational Development Institute); donors and non‑governmental organisations | No overall report on the education system; various specialised agencies periodically report on their respective areas of work; donors and NGOs are important providers of ad-hoc research and analysis |
Note: *In 2019, the State Education Inspectorate merged with the Educational Development Institute to form the Agency for Quality Assurance in Pre-University Education, and the school inspection function is now being fulfilled by the General Directorate for Pre-University Education (see Chapter 1).
Source: MoESY (2018[13]), OECD Review of Evaluation and Assessment: Country Background Report for Albania, Ministry of Education, Sports and Youth, Tirana; Wort, M., D. Pupovci and E. Ikonomi (2019[14]), Appraisal of the Pre-University Education Strategy 2014-2020, UNICEF Albania, Tirana.
High-level documents express a national vision for pre-tertiary education
Albania’s Pre-University Education Strategy 2014-2020 (hereby, the strategy) provides a vision for education focused on equipping students with the competencies and self-esteem needed for success in a global, diverse and technological world (MoESY, 2016[15]). The strategy sets out four broad policy objectives (see Chapter 1) that aim to improve Albania’s pre‑tertiary education system in the areas of governance, inclusion, quality assurance and the professional development of teachers and school leaders. These areas were identified as key challenges in a report prepared by the ministry and education experts in 2014 (MoESY, 2014[16]). The report underwent a wide consultation process with international, national and local stakeholders who helped shape the strategy’s contents. The current strategy does not address higher education or vocational education and training.
This high-level document offers continuity with Albania’s previous education strategies and reform programmes. This consistency has helped Albania achieve important structural changes in recent years, such as the introduction of a new competency-based curriculum, the extension of compulsory education from eight to nine years and the establishment of transparent processes for recruiting teachers and school leaders (see Chapter 1). The education strategy is also aligned with the overarching National Strategy for Development and Integration 2014-2020, which provides a national vision for social and economic development that aims to bring the country closer to European Union (EU) accession (Republic of Albania, 2013[17]).
Indicators and targets are not aligned to drive system improvement
Albania’s education strategy contains many important components. In particular, it provides an overview of expected results, a financial summary and deadlines for implementation. The strategy also includes an annex with a set of indicators to measure progress; however, these are generally limited to monitoring inputs and outputs, such as how many students have tablets or how many schools have libraries (MoESY, 2018[13]). A noticeable gap in the strategy’s indicator framework is the lack of clear targets, which were an important feature in earlier drafts of the strategy. For example, the draft strategy’s indicator framework set targets to raise per-student spending to 999 ALL and to increase the share of inspected educational institutions to 75% by 2020 (MoESY, 2014[18]). However, these targets were removed from the final version of the strategy that was adopted by the government. This is significant because Albania’s education sector is underfunded and many of the actions proposed in the strategy cannot be achieved without an increase in funding. The lack of alignment between indicators and targets makes it difficult for the education strategy to drive system improvement.
Implementation planning is relatively weak, making it hard to co-ordinate and align actors behind strategic goals
Albania’s education strategy includes an implementation plan that outlines main activities, expected outcomes, main contact points and timelines. Despite this, the plan provides limited direction to co-ordinate education actors. This is partly because the implementation plan does not systematically translate actions from the main text into clearly sequenced steps for implementation. For example, only three of six sub-actions proposed to develop data infrastructure in the main text of the strategy are included as steps in the implementation plan: developing the software, conducting analysis of data and preparing targeted reports for different audiences (MoESY, 2016[15]).
Other actions the strategy outlines as important steps needed to establish the new EMIS, such as installing the infrastructure, implementing it and training teams responsible for maintenance and data processing, are excluded from the implementation plan. As a result, the plan does not fully serve its function of aligning efforts and resources behind Albania’s strategic goals.
Moreover, despite the fact that the current strategy calls for the development of annual sector-wide implementation plans, in reality different agencies develop their own annual work plans, which, in turn, make up the “components” of a National Plan for Education (MoESY, 2018[13]). This hinders the government’s ability to organise efforts across agencies and allocate resources according to priorities. For example, the overarching implementation plan identifies the ministry and Regional Education Directorates (REDs) as the bodies responsible for building the country’s EMIS. However, the ministry has assigned the ESC this task and there has been no subsequent engagement from the ministry and REDs. The lack of strong sector-wide planning and co-ordination makes it difficult for Albania to establish a central source of information about the education system.
Discussions about the future of the education strategy are underway
There are ongoing discussions within the ministry about what to do once the current education strategy ends in 2020. Options include extending the existing strategy or developing a new strategy that will cover the whole education sector, including higher education and vocational education. This next strategy will direct reform during a critical period for Albania’s national development and potential accession to the EU. Merely extending the current strategy would represent a missed opportunity to evaluate the strengths and weaknesses of the education system relative to this changing landscape, and to develop a strategic set of policy goals for the future.
The process of developing a new strategy is as important as the content itself. Albania plans to continue the existing practice of conducting analysis to identify strategic issues and consulting with stakeholders in order to develop the next education strategy. However, discussions about what resources are available and required for successfully implementing the new strategy seem to be limited. While requirements for the European Commission’s Instrument for Pre-Accession Assistance are part of the planning discussion, budgeting the future education strategy also requires pragmatic conversations and support from the Prime Minister’s Office and the Ministry of Economy and Finance. Albania’s current education strategy has a budget gap of around 43% between the planned budget for 2019 and what was approved (Wort, Pupovci and Ikonomi, 2019[14]). More robust funding discussions would help the education ministry prioritise goals and actions for system improvement.
Tools for system evaluation are poorly co-ordinated and often unreliable
Albania has established some of the institutions and processes required to gather information and monitor the performance of the education system. However, there are significant challenges in how data is collected and its resultant quality. While Albania introduced a new national assessment of student achievement in Grade 5, it is not yet administered in a way that yields reliable results. Albania’s National Basic Education Examination faces similar reliability concerns (see Chapter 2). This means that the country’s only trustworthy sources of information about student learning are sample-based international assessments, such as the OECD Programme for International Student Assessment (PISA), and the State Matura Examination that students take at the end of upper secondary school.
Processes to collect administrative data are poorly co-ordinated and out of date
The ministry, through its Monitoring, Priorities and Statistics sector (hereby, the MPS sector), works with regional directorates and local education offices (recently restructured, see Evaluation institutions) and schools to collect administrative data according to a template provided by the Albanian Institute for Statistics (INSTAT). First, schools report their data to local education offices and regional directorates using Microsoft Excel and emails. Then the local education offices and regional directorates compile and share this information with the ministry. These procedures are time-consuming, error-prone and do not allow for real-time monitoring. Albania also faces challenges mapping the codes of national databases to international questionnaires, though the ministry is working with INSTAT to better align national data with international standards set by the United Nations Educational, Scientific and Cultural Organization (UNESCO).
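The compilation step that local education offices and regional directorates currently perform by hand in Excel could, in principle, be automated with basic validation. The sketch below is illustrative only: the field names (`school_id`, `grade`, `enrolment`) are hypothetical and do not correspond to the actual INSTAT template.

```python
from collections import defaultdict

def consolidate_returns(school_returns):
    """Merge per-school data returns into one national table and flag
    duplicate submissions, one of the errors that manual compilation
    via Excel and email tends to miss.

    Each return is a dict with hypothetical keys 'school_id', 'grade'
    and 'enrolment'.
    """
    seen = defaultdict(int)
    national_table = []
    for row in school_returns:
        # Count how often each (school, grade) combination is reported.
        seen[(row["school_id"], row["grade"])] += 1
        national_table.append(row)
    duplicates = sorted(key for key, count in seen.items() if count > 1)
    return national_table, duplicates
```

In practice, an EMIS performs checks like these at the point of data entry, rather than after files have been emailed up the reporting chain, which is one reason a functional EMIS would reduce both errors and reporting delays.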
In addition to reporting information to local education offices, regional directorates and the ministry, schools and other education institutions must also provide information to the ESC. This data feeds into one of the ESC’s four educational databases: the student registry for schooling; the student registry for higher education; the database for the State Matura exam; and a registry of state exam results for regulated (licenced) professions, which includes teachers. These databases help the ESC deliver national assessments and exams, for example, by identifying how many test booklets need to be printed. However, some data collection runs in parallel: the ministry, regional directorates, local education offices and the ESC all collect data on student grade point averages. The ministry, regional directorates or local education offices (the last of which have direct contact with schools) also sometimes ask schools to report student assessment results that are already available in ESC databases. This duplication creates an unnecessary reporting burden for schools.
Data is not integrated across Albania’s databases
There are no unique student or teacher identifiers within or across Albania’s various education databases. In fact, students are currently assigned a different identification number for each national assessment and exam they take, and this information cannot be linked. The lack of unique identifiers makes it difficult to merge databases, conduct cross‑institutional research, track an individual’s progression through the system and analyse education inputs, processes and outcomes. For instance, while the ministry stores student and teacher demographic data in its central databases, examinations data are located in ESC systems and there is no link between the two. As a result, researchers cannot readily access the data needed to answer important questions, such as which teachers and schools are achieving good student outcomes and which are in need of more support. The problem of data fragmentation and overlap is compounded by the fact that Albania’s Ministry of Economy and Finance collects and manages all data on vocational education and training. This makes it difficult to collect reliable information about enrolment rates and employment outcomes across different upper secondary programmes.
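To make the problem concrete, the join that a shared unique identifier would enable can be sketched in a few lines. The record layouts below are hypothetical; Albania’s actual databases use different structures and, crucially, no common key, which is precisely why this kind of linkage is currently impossible.

```python
def link_records(demographics, exam_results, key="student_id"):
    """Join two record sets on a shared unique identifier.

    Returns one combined row per exam result whose identifier also
    appears in the demographic data. Without a common key across
    databases, this join cannot be performed at all.
    """
    # Index demographic rows by the shared identifier for fast lookup.
    by_id = {row[key]: row for row in demographics}
    linked = []
    for result in exam_results:
        profile = by_id.get(result[key])
        if profile is not None:
            linked.append({**profile, **result})
    return linked
```

With such a key in place, questions like “how do exam results vary by school location?” become a single join followed by a group-by, instead of a bespoke data-matching exercise.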
The Educational Services Centre is developing a new education management information system
In order to upgrade the way in which Albania collects and manages education data, the Educational Services Centre is developing an education management information system, called Socrates, which will introduce a unique student identifier and store information related to students, teachers, curriculum and schools in the pre‑tertiary education system. The unique identifier will follow students from the time they enter the school system until the end of upper secondary school. It is not yet decided if the unique identifier will follow students until the end of higher education.
Socrates is currently in the pilot phase and, despite the 2015 deadline for installing the software and infrastructure set out in the strategy, the ESC expects that Socrates will only be functional by mid-2020. The EMIS has the potential to streamline data collection in Albania by creating a common platform for data entry and standardised practices for schools to input data. However, there is no clear plan to train school officials on how to use Socrates and, importantly, the ministry’s MPS sector has not been involved in the planning process. Developing such an important tool in isolation from other key education actors increases the risk that they will bypass Socrates and continue collecting their own data in parallel using Excel and email, both because their needs may not be considered in the design of Socrates and because they may not have the skills needed to use the tool effectively.
Public access to education data is improving, but remains limited
The INSTAT Labour Market and Education online data portal provides an easy-to-use interface that allows individuals to access a limited selection of education indicators, such as student-teacher ratios, fields of study and rates of enrolment and graduation. Users can download data as Excel files, analyse them and re-use them. The ministry prepares annual statistical yearbooks with this administrative data and the ESC regularly reports on results from the national assessment and exams. However, data provided by the ministry and the ESC are only available as tables within PDF documents and do not have the same functionalities as the INSTAT data portal. The lack of automated data reporting means that the government, education agencies and the public must submit written requests for education data to the ministry or manually input data copied from annual reports into statistical software. Statistical data management currently depends on one or two key people in the ministry’s MPS sector, so the capacity to respond to data requests is limited and delays for users are common. At this stage, it is unclear whether the Socrates EMIS will establish an interface for different users within the government or make parts of the system available to the public.
The national assessment of student achievement is not standardised
Albania established the Assessment of Primary Education Pupils’ Achievement (VANAF) in 2015-16. Following a pilot study of sample-based national assessments for Grades 3 and 5 in 2014-15, the ministry chose to implement the VANAF as an annual census-based assessment for all students in Grade 5 and not to conduct a national assessment for lower grades. The ESC, which manages the assessment, produced an official report that includes a description of the methodology, student results and sample test questions. However, there is no analysis of whether the assessment’s design was effective, nor an explanation of why the ministry decided to forgo the Grade 3 assessment. This decision aligns with Albania’s tradition of conducting external census-based assessments at key stages of the education cycle but does not necessarily provide the best approach for supporting system goals to improve student learning. Table 5.2 provides a summary of key information about Albania’s national assessment.
Table 5.2. Key information about Albania’s national assessment, VANAF
| Topic | Summary |
|---|---|
| Stated purposes | |
| Grade and frequency | Annual assessment at the end of primary education; compulsory for all students enrolled in Grade 5 |
| Subjects | Albanian language; mathematics; science |
| Variables collected | 12 questions in the background survey; collects information on gender (male/female), type of school (public/private) and geographic area (urban/rural); no proxy for socio-economic background |
| Format | 90-minute paper-based test with 50 multiple-choice and open-ended questions in an integrated format: 22 questions on Albanian language (30 points), 18 on mathematics (20 points) and 10 on natural science (10 points) |
| Marking | Local education offices mark the test within 15 days and share the results with the ESC. Assessments are marked out of a maximum of 60 points, which correspond to proficiency levels ranging from insufficient (Level 0) to very high (Level 6). |
| Results | The ESC prepares a public national report each year on student achievement in the VANAF. These reports are shared with regional directorates, local education offices and schools. |
Source: MoESY (2018[13]), OECD Review of Evaluation and Assessment: Country Background Report for Albania; ESC (2017[19]), Vlerësimi i Arritjeve të Nxënësve [Assessment of Students' Achievements of the 5th Grade], The Educational Services Centre, Tirana.
The VANAF does not yield results that are reliable at national level
While it is positive that Albania has a regular national assessment of student learning outcomes, there are some concerns with respect to the VANAF’s design and implementation. In particular, by assessing only Grade 5, it leaves unaddressed an information gap about student learning in the early years of primary school. This prevents the ministry from developing a better understanding of student performance early on, when adjustments to the curriculum and teaching practice could have the greatest impact on learning outcomes throughout school. Marking procedures present another concern, since they do not currently ensure comparability across the country. As a result, the ministry cannot use the VANAF as a reliable tool for system monitoring. Albania’s National Basic Education Exam faces similar reliability concerns (see Chapter 2).
As a census-based assessment, the VANAF has the potential to support granular levels of analysis, such as examining differences associated with attending a satellite and/or multi‑grade school versus a more traditional school setting. However, because there is no unique student identifier or reliable EMIS to link information across different databases, this type of analysis relies on information in the VANAF background questionnaire. Currently, the background questionnaire allows results to be disaggregated by gender (male/female), type of school (public/private) and geographic area (urban/rural), but it provides no information to help better understand socio-economic disparities, school settings or other associations that are relevant for policy.
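As an illustration of the kind of analysis a unique student identifier would enable, the sketch below joins hypothetical assessment results to a hypothetical EMIS roster and disaggregates mean scores by a context variable. All identifiers, field names and figures here are invented for illustration; they do not reflect the actual Socrates data model.

```python
# Hypothetical sketch: a shared student identifier lets assessment results be
# joined to EMIS records so scores can be disaggregated by context variables.
from statistics import mean

# Hypothetical EMIS roster keyed by a unique student identifier.
emis = {
    "AL001": {"school_type": "public", "area": "rural"},
    "AL002": {"school_type": "public", "area": "urban"},
    "AL003": {"school_type": "private", "area": "urban"},
}

# Hypothetical VANAF results (out of 60 points), keyed by the same identifier.
vanaf = {"AL001": 34, "AL002": 41, "AL003": 52}

def disaggregate(results, roster, variable):
    """Group assessment scores by an EMIS context variable and average them."""
    groups = {}
    for student_id, score in results.items():
        key = roster[student_id][variable]
        groups.setdefault(key, []).append(score)
    return {key: mean(scores) for key, scores in groups.items()}

print(disaggregate(vanaf, emis, "school_type"))
# -> {'public': 37.5, 'private': 52}
```

Without the shared identifier in the first column, this join is impossible and analysis is confined to whatever variables the background questionnaire happens to collect.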
These challenges prevent the VANAF from yielding reliable evidence about the extent to which students achieve curriculum standards. They also limit the type of research that could support system evaluation and inform policy. Despite these reliability issues, Albania still uses VANAF results as a transparency and accountability tool. In particular, the ESC publishes a national report each year that ranks schools according to aggregate student scores on the VANAF, without any contextualising information. This is unfair to schools that are located in disadvantaged areas or serve a significant proportion of students who face challenges that affect their learning (see Chapter 4) (OECD, 2013[1]). The national report is the only outreach material prepared for disseminating VANAF results. While these reports are sent to schools, regional directorates and local education offices, they are not tailored to the different audiences and data are only accessible in Word and PDF tables. This discourages secondary analysis and undermines the potential of the assessment to inform policy and practice.
Albania participates increasingly in international assessments
Albania’s ESC is responsible for administering international assessments and producing national reports on PISA results, most recently from PISA 2018 (forthcoming). Participation in PISA 2000, implemented in 2002, allowed for the first international comparison of student learning outcomes in Albania and sparked a national debate on the quality of education and the need for better measures of learning quality, efficiency and accountability in the education system (OECD, 2003[20]). The country has regularly participated in PISA since 2009 and is taking part in the International Association for the Evaluation of Educational Achievement’s TIMSS and PIRLS for the first time in 2019 and 2021 respectively. Despite Albania’s increasing participation in international assessments, the country has not used these instruments to set national benchmarks or goals that could help drive system improvement.
Evaluation and thematic reports
No annual report evaluates the state of the education system
The ministry does not produce an annual report on the performance of the education system. However, most technical agencies prepare annual reports based on their respective programmes of work. The only regular public reporting the ministry provides is an annual statistical yearbook on education, sports and youth. This report monitors trends according to administrative data, such as the number of students enrolled across different levels of education over time, by gender, type of school (public vs. private), geographic area (urban vs. rural) and region. Examples of regular reports from Albania’s technical agencies include the former State Education Inspectorate’s annual report summarising findings gathered through school and thematic external evaluations (see Chapter 4) and ESC reports on results from student assessments and exams. However, these strands of information are not pulled together in a comprehensive report that evaluates the overall state of the Albanian education system. The lack of a regular analytical report makes it difficult to highlight and communicate main system-level challenges and identify policy priorities.
In 2019, UNICEF supported the ministry in conducting an appraisal of the 2014-2020 education strategy. This was the first comprehensive system-level analysis of Albania’s education sector since 2014, just prior to the adoption of the current education strategy. The ministry also plans to start monitoring the 2017-2022 Strategy of Research, Innovation and Science, which outlines the vision, policies and strategic objectives for the development of science, technology and innovation (Republic of Albania, 2017[21]). While these exercises mark an important shift towards more comprehensive system evaluation, they appear ad‑hoc and there are no plans to establish a regular calendar of strategy reviews.
Ad-hoc thematic reports provide some information for system evaluation
Technical agencies within the ministry periodically produce thematic reports that address particular issues within the education system. For example, agencies have published reports on the training needs of principals and teachers (2016), on the perception of teachers and parents towards the quality of the pre-tertiary education reform (2017) and on student achievement in multi-grade classrooms (2017). Many of these ad-hoc thematic reports are available on the websites of the respective agencies that produce them; however, some are prepared for internal use only. While the ministry and its affiliated agencies are responsible for most analytical reports about the education system, they commission external researchers to undertake some of the work. However, members of the research community who met with the OECD review team revealed that these requests were often for ex-post evaluations and were not always used to help inform future decision-making.
Donors and non-governmental organisations have undertaken analyses that support system evaluation
International and non-governmental organisations have contributed to system evaluation in Albania by undertaking valuable analyses. For example, UNESCO published a sector-wide education policy review in 2017 that addressed curriculum, ICTs in education, and policies related to teachers and school leaders. UNICEF has also made important analytical contributions, notably by supporting a pilot national assessment that laid the foundation for the VANAF and conducting an evaluation of the implementation of the new competency‑based curriculum. While the work of external actors can provide important insights for system evaluation, it can also lead the government to pay less attention to developing the technical capacity of national agencies. For example, the ministry’s MPS sector was tasked with conducting regular monitoring and evaluation of the education strategy; however, with only three staff members, external support was needed to make this activity happen.
Evaluation institutions
At the time of this review, Albania did not have a dedicated unit or agency responsible for research and evaluation across the entire education system. However, the Albanian government implemented a reform in March 2019 that established two new institutions, one of which is now responsible for assessing the performance of the education system. The two new institutions are:
The Quality Assurance Agency, which merged the functions of the former State Education Inspectorate and the Education Development Institute. This agency is now responsible for setting teaching standards, learning standards, curriculum design and teacher training programmes (the former Education Development Institute mandate), as well as designing and revising the framework for school evaluation (the former State Education Inspectorate’s mandate, see Chapter 4). In addition, the Quality Assurance Agency has a new mandate to monitor the performance of the education system. However, this appears to overlap with the ministry’s MPS sector, which also has some responsibilities for system monitoring.
The General Directorate for Pre-University Education (hereby, the General Directorate), which serves as the ministry’s implementation arm for pre-tertiary education by co-ordinating the work of four regional directorates located in Lezhë, Durrës, Korçë and Fier, as well as local education offices for 51 municipalities. The General Directorate and the regional directorates have identical organisational structures that are responsible for planning school budgets, collecting and processing statistical data, implementing curriculum and standards and providing technical assistance to schools. The local education offices are much smaller bodies that concentrate on curriculum implementation and supporting schools. Importantly, the General Directorate and regional directorates are also responsible for monitoring, evaluating and inspecting schools, using the framework developed by the Quality Assurance Agency.
This re‑organisation presents two main concerns for system evaluation. The first is that merging the responsibility for school inspections with the same body that supports schools threatens to reduce the objectivity of external school evaluations and weaken the quality of the information available on the performance of schools (see Chapter 4). Another concern raised by this new organisational structure is that responsibilities for system evaluation appear fragmented. At present, the ministry’s MPS sector, the new Quality Assurance Agency, and to some extent the new General Directorate all have some remit to help monitor and/or evaluate the education system. Considering Albania’s limited capacity and resources for conducting system evaluation, it is important that these bodies are well co‑ordinated to ensure their roles are complementary and avoid duplication.
Policy issues
The primary challenge to developing system evaluation in Albania is the limited availability of co‑ordinated and high‑quality data. Addressing this requires a modern system for collecting and using data to support evaluation and addressing key data gaps, in particular with respect to student learning. Compared to the majority of European countries in the OECD, Albania’s education data systems are still nascent, partly because of the relatively weak culture of evaluation within the government. As a result, strategies and policies are often set without sufficient analysis, regular monitoring and reporting on progress is limited, and capacity for fulfilling these important functions is relatively weak. Building stronger demand for information and analysis within government, and developing the institutions and procedures to support a culture of system evaluation, will be important to ensure that the data systems recommended by this review are established, utilised and developed meaningfully over time. The development of a new education strategy presents an opportunity for Albania to embed evaluation more centrally in the government’s planning and policy‑making process.
Policy issue 5.1. Establishing the processes and capacity needed to conduct system evaluation
The culture of monitoring, evaluation and research in Albania’s education system is underdeveloped compared to practices in OECD and other European countries. Prior to 2017, there was no agency or unit responsible for monitoring the education system. Today, the ministry’s MPS sector remains poorly staffed with only three individuals responsible for monitoring implementation of the education strategy and managing all official statistical evidence in the sector. At the same time, the newly established Quality Assurance Agency is now responsible for monitoring education system performance but has limited experience with system evaluation and did not receive any additional funding or staff to help fulfil this mandate. The General Directorate has also been tasked with collecting and processing statistical data about the performance of schools but it is unclear how this information will feed into other system monitoring and evaluation efforts.
Without stronger capacity and clearer processes for system evaluation, it will be difficult to promote a culture of regular evaluation and strategic learning within Albania’s education sector. The country needs to stimulate greater demand for evidence and greater investment in its generation, which in turn can help inform strategic planning and policy-making. By disseminating information in ways that are timely, relevant and accessible, Albania can also support greater transparency and public accountability for educational improvement. This can help build trust in the major reforms underway and support for their achievement.
Recommendation 5.1.1. Integrate evaluation processes into the future strategy
The task of developing a new education strategy presents a chance for Albania to integrate evaluation more centrally into the government’s planning and policy-making processes. While the current education strategy was built on an analysis of sector performance and a broad consultation process, this resulted in a long list of aspirations and actions with no clear set of priorities. This is because all of the comments from the consultation process were included in the document without selection and prioritisation or consideration for the cost of each proposed action. Considering Albania’s limited education budget, it is crucial the government direct reform efforts to where they will have the greatest impact. Prioritising strategic issues, setting clear goals and developing implementation plans that are detailed and feasible can help strengthen results-oriented and accountable planning processes. This will require strong system evaluation tools, such as a reliable EMIS (see Policy issue 5.2), that provide reliable and timely data to inform policy decisions, communicate the rationale for these choices and monitor progress.
Prioritise a select number of challenges and goals to focus the education strategy
Conducting an analysis of sector performance and organising a consultation are positive aspects of Albania’s process for developing its education strategy. However, including all of the comments and proposals from the public consultation without prioritising them resulted in a final strategy document that has a long list of 48 expected results, 43 main activities and 194 sub‑activities (MoESY, 2016[15]). Tackling a multitude of challenges at once is not an effective way to identify strategic issues and set balanced goals because the lack of prioritisation makes it difficult to orient actors around a common agenda for improvement. As such, this review recommends that Albania prepare future education strategies by continuing to triangulate information about the education system, drawing on both analysis and consultations, but also by deciding which issues are most pressing and identifying goals and actions that focus the scope of the education strategy. While consulting on priorities helps ensure agreement and an informed selection, the final decision on the scope and priorities should come from the top-level decision‑makers who are responsible for adopting the strategy (OECD, 2018[22]).
The prioritisation process should reconcile the variety of aspirations different stakeholders may have with what is realistic in the Albanian context, especially in terms of resource, technical and human capacity. This implies creating stronger links between the strategy’s proposals and the state education budget. Albania currently includes a high-level summary budget in its education strategy, which gives the document credibility by identifying the cost for proposed activities. However, since many of the nearly 200 sub-activities require significant financial investments and human capacity, such as reforming the State Matura Examination, resource implications for each of the proposed activities should be carefully considered before they are included in future strategies. This can help the ministry determine if available funding is sufficient for each proposal or if gaps can be reasonably filled to achieve desired outcomes. The Ministry of Economy and Finance and the Prime Minister’s office should be included in the prioritisation exercise to ensure the contents of future education strategies are financially viable and align with broader government goals.
Clearly present evidence to justify the education system’s top challenges
Once a set of priorities has been identified, it is important for the strategy document to clearly present the evidence and reasoning behind the prioritisation process. In the current strategy, Albania included a synthesis of findings from the comprehensive report on the state of the education system that informed the strategy’s contents. However, the synthesis does not systematically highlight evidence to provide a clear understanding of system challenges, why they exist and how acute they are. For example, the strategy identifies the challenge of including children with Roma or Balkan Egyptian backgrounds in kindergarten and preparatory classes (MoESY, 2016[15]). However, the text does not provide any data on enrolment rates by student profile. This may be due to a lack of available data, but without clear evidence the strategy cannot illustrate the extent to which the enrolment of children with these backgrounds is a challenge. Without such information, individuals who are supposed to act on the strategy may not be convinced why this was identified as a strategic challenge for the country.
Strategy documents do not need to include all of the evidence gathered during the system‑level analysis but a selection should be included to help inform subsequent prioritisation processes (OECD, 2018[22]). Albania should ensure that future education strategies use evidence to help clearly express connections between national education challenges, system goals and policy actions. This can be done by reviewing how system challenges are presented in the strategy document and organising evidence in a way that clearly introduces the need for a particular goal or activity. There are several tools that can help support this process and make the presented information more strategic and focused. For example, Namibia presented an analysis of strengths, weaknesses, opportunities and threats (SWOT) and an analysis of political, economic, social, technological and legal factors (PESTL) to set the context for introducing the strategic issues addressed in its national education strategy (see Box 5.1). A similar approach would help Albania’s next education strategy make a stronger and more targeted case for reform.
Include precise targets in the indicator framework
Once a small set of goals and actions has been agreed on as priorities, the ministry should set up a comprehensive national indicator framework with realistic targets and milestones for improvement. An important part of this process will be to assess what information sources are currently available and identify gaps in the data needed to support the strategy’s achievement.
Early drafts of Albania’s current education strategy included an annexed indicator framework with measurable targets; however, the space for such targets within the official strategy adopted by the Albanian government is empty. While the draft targets had proposed gradual improvements, it is unlikely these were feasible without a detailed implementation plan and sufficient resources. The removal of targets makes it difficult for various actors to know what they are working towards, measure progress and identify where greater efforts are needed to reach system goals. In the future, Albania should ensure that clear, specific and realistic targets are included within the strategy’s indicator framework to serve as yardsticks for measuring success and support system planning.
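To make the idea concrete, the sketch below shows one minimal way an indicator framework entry could pair a baseline with a dated target so that progress becomes measurable rather than aspirational. The indicator names, baselines and targets are hypothetical, not drawn from Albania’s strategy.

```python
# Hypothetical sketch of an indicator framework: each entry pairs a baseline
# with a target so progress can be measured, not merely described.
indicators = [
    {"name": "Grade 5 students at or above minimum proficiency (%)",
     "baseline": 55.0, "target": 70.0, "latest": 61.0},
    {"name": "Kindergarten enrolment rate (%)",
     "baseline": 80.0, "target": 90.0, "latest": 82.0},
]

def progress(indicator):
    """Share of the baseline-to-target distance covered so far."""
    span = indicator["target"] - indicator["baseline"]
    return (indicator["latest"] - indicator["baseline"]) / span

for ind in indicators:
    print(f'{ind["name"]}: {progress(ind):.0%} of the way to target')
```

Leaving the target field empty, as in the adopted strategy, makes this calculation impossible and leaves actors without a yardstick for success.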
Box 5.1. Communicating strategic issues through high-level strategy documents
The Strategic Plan (2017/18 – 2021/22) of the Namibian Ministry of Education, Arts and Culture dedicates a section to carefully explaining the evidence analysis that was performed to determine the country’s strategic issues. In particular, the analysis was based on PESTL and SWOT analysis tools:
Political, economic, social, technological and legal factors (PESTL) analysis is a framework for the analysis of the external environment of the policy in question. It comprises a checklist of areas to be examined when analysing these factors. It is used to determine the external factors that have or will have an enabling or hindering impact on the policy.
Strengths, weaknesses, opportunities and threats (SWOT) analysis is done through brainstorming sessions, workshops or focus groups and involves a wide range of stakeholders and representatives from related organisations. Challenges in SWOT analysis include confusing a strength with an opportunity and a weakness with a threat. This judgement should always be the result of deliberative discussion among stakeholders and no factor should appear under more than one category.
These tools can be, and normally are, used together in order to consider the internal and external factors that can have an impact on a given policy. For example, the impacts of a specific political agenda identified in the PESTL analysis can be translated into opportunities and threats in the SWOT framework. In the case of Namibia, the Strategic Plan includes two short paragraphs that highlight findings from the PESTL and SWOT analyses respectively. The resulting strategic issues, found in a separate section of the document, are further summarised into twelve categories, including developing a plan for infrastructure and improving data management.
Sources: Ministry of Education (2017[23]), Strategic Plan 2017/18 – 2021/22, http://www.moe.gov.na/files/downloads/b7b_Ministry%20Strategic%20Plan%202017-2022.pdf (accessed on 18 November 2019); Vági, P. and E. Rimkute (2018[24]), “Toolkit for the preparation, implementation, monitoring, reporting and evaluation of public administration reform and sector strategies”, https://doi.org/10.1787/37e212e6-en.
Develop more detailed implementation plans
Albania’s current education strategy includes a high-level implementation plan that associates each activity with a timeline and points of contact. This practice should be continued in future education strategies as it can help keep the implementation process on schedule and facilitate co-ordination. Despite these positive components, however, Albania’s implementation plan lacks the clarity and detail required to direct action across the sector, in part because it does not systematically translate activities from the main text into clearly sequenced steps for implementation.
The ministry could improve future implementation plans by clearly aligning activities to system goals and providing more specific steps for implementation. For example, if the next strategy includes an emphasis on developing education data infrastructure, as this review recommends, then the implementation plan would need to include not only developing EMIS software but also installing it and training the individuals who will use and manage it (see Table 5.3 and Table 5.4). These plans can maintain some flexibility to accommodate the lessons learnt throughout the implementation process but should still identify which actors to involve and hold accountable for developing the data infrastructure, as well as consider what tools and resources these individuals will need to achieve the desired outcomes (Viennet and Pont, 2017[25]). While high-level implementation plans can provide an important framework for the duration of Albania’s education strategies, detailed annual plans would help strengthen the co-ordination of education actors around national strategic priorities and support system planning.
Table 5.3. Example of item from Albania’s current implementation plan
| Policy objective | Activity | Sub-activities (products) | Responsibility | Timeline |
|---|---|---|---|---|
| A. Enhance leadership, governance and resource management capacities. | A.4. Infrastructure for data processing in education is built. | Develop software and infrastructure for EMIS. | MoESY, RED | 2014-15 |
| | | Analysis and comparison of data. | MoESY, RED, Schools | 2015-16 |
| | | Reports are published systematically for each audience. | MoESY | 2016-20 |
Source: MoESY (2016[15]), Strategjisë së Zhvillimit të Arsimit Parauniversitar, për Periudhën 2014–2020 [Strategy on Pre-University Education Development 2014-2020], www.qbz.gov.al (accessed on 16 January 2019).
Table 5.4. Proposal for item in Albania’s future implementation plan
| Goal | Activity | Sub-activities | Timeline | Responsible agents: Lead | Responsible agents: Partners | Outcome |
|---|---|---|---|---|---|---|
| Enhance efficiency of the education system. | Modernise the education management information system (EMIS). | Develop software and infrastructure for EMIS. | Years 1-2 | MoESY | ESC, regional directorates | A modern and fully operational EMIS. |
| | | Install appropriate physical infrastructure for EMIS at the school and ministry level. | Years 2-3 | MoESY | Schools | |
| | | Train teams responsible for maintenance, processing and analysis of data. | Years 3-4 | MoESY | | |
| | | Establish protocols for data definitions, collection and retrieval from schools. | Years 3-4 | MoESY | INSTAT | |
| | | Implement EMIS and train users on above protocols. | Years 5-6 | MoESY | Regional directorates, schools | |
| | | Create quality assurance procedures to verify the accuracy of data. | Years 5-6 | Supreme State Audit | MoESY, INSTAT | |
| | | Regularly report data for different audiences. | Years 5- | MoESY | | |
| | | Create user-friendly interfaces to make data easily accessible for users. | Years 7-9 | MoESY | Schools, researchers | |
Source: Authors.
Recommendation 5.1.2. Develop the capacity to conduct system evaluation
This review recommends that Albania’s future education strategies give central attention to strengthening monitoring and evaluation capacity. This can help develop the country’s nascent tools for system evaluation and generate a greater demand for evidence to inform education policy. Strengthening monitoring and evaluation capacity requires well‑co‑ordinated evaluation bodies that are objective and credible and have sufficient resources and staff with the relevant skills needed to conduct rigorous and reliable analysis. In many OECD countries, a network of agencies and institutions (e.g. government agencies, research institutions, universities) contribute to system evaluation, which can be especially helpful if they have technical expertise in different areas. Moreover, when these agencies are allowed to operate with some degree of autonomy, they can provide an independent voice to help scrutinise education policies and the sector as a whole. In recent years, there has been progress in Albania’s capacity to collect, analyse and use evidence about its education system. However, the country’s overall capacity for evaluation remains underdeveloped and the recent re-organisation of education agencies has led to confusion about the roles different agencies play in monitoring and evaluating system performance.
Albania should strengthen the capacity of the ministry’s MPS sector so that the central administration is able to collate information from across the education system and use it to inform and evaluate policy. At the same time, there is an urgent need to more clearly define the evaluation role of the new Quality Assurance Agency to avoid duplication with other education agencies and the MPS sector. Efforts to develop evaluation capacity should also extend to the General Directorate and its four regional directorates. This is especially important given that regional offices are expected to increasingly support schools, as recommended by this review (see Chapter 4). To do this, regional directorates need to be able to identify the main education issues facing their region, use evidence to set priorities for region-wide professional development and identify which schools require additional support. This will require using a wide range of evidence about the performance of schools, teachers and students under their jurisdiction.
Co-ordinate monitoring and evaluation responsibilities at the central level
Responsibilities for monitoring and evaluating the entire Albanian education system currently cut across different parts of the ministry and education agencies, making it difficult to co-ordinate activities. This review recommends that Albania clearly define the roles and responsibilities for system evaluation, in particular for the ministry’s MPS sector and the new Quality Assurance Agency.
Considering the MPS sector’s proximity to central leadership, this body has the potential to help raise the demand for evidence among high-level policy makers. Examples of the tasks this sector should be responsible for include providing the ministry with information to inform policy decisions on a daily basis and guiding the ministry’s research agenda to evaluate strategic issues or specific policies. The sector could perform research and evaluation work itself, or it could prioritise and commission it from other actors, such as the Quality Assurance Agency or universities. In the future, the ministry should consider making the MPS sector responsible for managing Socrates (see Recommendation 5.2.1). This would help position Socrates as a central, unified source of national education data and ensure that the sector does not continue collecting data through Excel and email in parallel, while the modern EMIS is managed by another agency.
On the other hand, the Quality Assurance Agency should serve as Albania’s technical body for education research and system evaluation. Specifically, the agency should be responsible for regularly reporting on progress towards strategic goals (see Recommendation 5.1.3). For example, the agency could prepare the analytical report on the quality of education that this review recommends. This would help offset some of the MPS sector’s responsibilities, such as conducting regular appraisals of the strategy’s implementation. Placing the responsibility for evaluating and reporting on system goals within a technical agency located outside of the central ministry can help reduce the risk of political influence in evaluation activities while developing education research capacity in general.
Ensure evaluation bodies have the resources needed to achieve their mandates
Albania’s evaluation bodies require adequate financial and human resources in order to conduct system evaluation. Currently, the MPS sector only has three staff members and the Quality Assurance Agency has limited experience in system evaluation. Moreover, while the agency has dedicated departments for system evaluation and quality evaluation respectively, it did not receive sufficient resources and does not have adequate capacity to carry out its new mandate to monitor system performance. Albania’s evaluation bodies require additional resources and reliable multi‑year funding so that political tensions and unstable resources do not limit or take priority over this work. In the case of the MPS sector, additional staff with experience in quantitative and qualitative analysis, use of evidence in policy‑making and delivery of policy would also be important to give the sector more credibility. The sector’s current staff size is inadequate if it is to achieve such a broad and important mandate. By comparison, a similar unit in Romania’s Ministry of National Education has nine staff members (Kitchen, 2017[26]) and agencies in some OECD countries have even more. In New Zealand for example, 150 designated review officers work for the country’s Education Review Office, which is responsible for monitoring the performance of New Zealand’s education system (Nusche et al., 2012[27]; ERO, n.d.[28]).
Strengthen regional capacity for evaluation and improve accountability for educational quality
The need for Albania to develop capacity for system evaluation is not limited to the central level. In light of decentralisation reforms, regional directorates are increasingly responsible for ensuring the quality and functioning of schools in their jurisdiction, which requires being able to use a range of evidence (MoESY, 2018[13]). Information on educational quality within these administrative areas is also needed to assess the extent to which regional directorates are meeting national education goals.
Regional directorates receive very little technical and analytical support from the central government and their capacity to carry out system evaluation is limited. For example, each regional directorate has a sector for human resources and statistics that reports education data to the central ministry. However, this unit does not make comparisons with national averages, nor does it systematically use contextual data to better understand the performance of schools and students in its jurisdiction. Previous reviews of Albania’s education system have found that the lack of information sharing, communication and transparency between the national, regional and local levels of education often results in duplication of efforts and implementation gaps (Wort, Pupovci and Ikonomi, 2019[14]; UNESCO, 2017[29]). To keep all actors informed about the state of education in different parts of the country and improve accountability, Albania should support regional capacity for system evaluation by:
Creating an analytical unit within each regional directorate. Once the Socrates EMIS is fully developed, regional directorates will no longer need to employ statisticians to help collect data from schools. Instead, the ministry might create an analytical unit within each of the four regional directorates to transform the role of these statisticians from data collectors to data analysts. In addition to staff with quantitative skills, these new regional analytical units would also need one or two staff members with broad research experience who can manage a regional research agenda and use evidence to inform and help deliver policy. This new unit could be tasked with conducting analysis to identify high-priority needs, evaluating policy approaches and reporting on regional progress to the General Directorate at the central level.
Providing regional analysis on key outcome indicators. The ministry and central education agencies could better support regional directorates by preparing analytical reports that collectively help identify the strengths and challenges of a particular region and measure its performance against national education goals. For example, the ESC could prepare individualised reports on VANAF results for each region, showing how the region performs against the national average and against areas with similar characteristics, as well as disaggregating data within the region.
Developing a regional view in Socrates. The new Socrates system could include a platform where regional directorates can access information and conduct their own analysis to better understand the factors that are influencing the performance of students and schools in their jurisdiction. Staff within the new regional directorate analytical units will need to be trained to use Socrates.
Providing templates for reporting against national targets. To hold regional directorates accountable for their role in providing quality education, the ministry could set regular reporting requirements, such as an annual report or brief on the state of education in each region. The new regional directorate analytical units could be tasked with preparing these reports, which should compare the region against national targets, highlight the main challenges and set out plans to address them. The ministry could provide a template for this reporting requirement that regional directorates could complete. These reports should be shared with the public, in addition to the central ministry.
Recommendation 5.1.3. Report on the quality of education regularly and promote the use of evidence to inform policy-making
Regular reporting on the state of the education system is important to keep policy makers, education practitioners and the general public informed and to hold the government accountable for its commitments (OECD, 2013[1]). Different agencies and units in the Albanian ministry publish annual reports on their work. For example, every six months the ministry and education agencies must report to the Prime Minister’s office using a standardised template that includes the priorities, general objectives, indicators, products and frequency of measurement for each actor; however, the completed templates are not available to the public (MoESY, 2018[13]). While these efforts provide valuable sources of information, the different strands are not pulled together on a regular basis to communicate how the education system is performing as a whole. The MPS sector should task the Quality Assurance Agency with publishing a regular report on the state of the education system.
In addition to developing the processes and capacity to conduct comprehensive system evaluation, Albania needs to ensure that evidence is available in timely, relevant and accessible forms. This will enable central government and regional education offices to use evidence about the current performance of the system to inform future policy decisions. Sharing evaluation information will also support accountability within the system and to the public. Over time, disseminating quality evaluation information can help the Albanian government and education community become more sophisticated and demanding consumers of evidence.
Regularly publish an analytical report about the education system
Albania does not have a regular analytical report on the quality of its education system. Partly as a result of low staff capacity within the ministry and a generally nascent culture of system evaluation, the only regular report the central ministry provides to the public is the annual statistical yearbook. This report is not analytical and mainly contains descriptive data in tabular form with very few charts or figures. By contrast, most OECD countries regularly publish an analytical report on education (OECD, 2013[1]). Typically, such reports analyse progress against the national indicator framework and explain the strengths and challenges of the system by studying related inputs, processes, outputs and outcomes (OECD, 2013[1]). National policy goals and priorities guide the content of this report.
Albania should include within the new education strategy a commitment to regular evaluation and reporting on system-wide progress, ideally on an annual or biennial basis. Establishing a regular reporting timeframe is crucial because it supports the policy planning process and lets the public know when to expect up-to-date information on government progress or the overall quality of education. This regular report should be a prominent document that goes beyond reporting on the individual work programmes of various education agencies and pulls together different strands of information from internal agencies and external researchers. Such a report could be produced by the Quality Assurance Agency (see Recommendation 5.1.2). In the Czech Republic and Portugal, annual reports not only provide analysis on the state of education but also information about future policies or activities designed to help improve the system (see Box 5.2). Expectations for Albania’s education report could include:
Reporting against key national goals. For example, the annual report might describe progress against short and long-term goals for improving learning outcomes. In Australia, for example, the Department of Education and Training’s annual report assesses the system’s results against key performance criteria and targets related to the country’s education outcomes (Department of Education and Training, 2018[30]).
Analysing progress made. This should take into account why progress may have been quicker or slower than expected in certain areas. For example, when reporting data on student learning outcomes, this kind of analysis would help policy makers understand not only how students perform, but why they perform that way and what can be done to improve performance nationally.
Ensuring the report is easily accessible and publicly available. The report needs to be easy to read and download from the ministry’s website. In the future, data included in the report could also be made downloadable in a format that the research community can easily re-use, to facilitate secondary analysis. This can encourage independent investigation into issues that affect the education system.
Receiving dedicated time for parliamentary debate. Albania should consider giving the Parliament an opportunity to organise committee hearings with the ministry’s senior leadership to discuss the contents of the report. In many OECD countries, this offers an important means to hold the government accountable and can be an effective way to embed the use of evidence in policy-making processes (see below).
Box 5.2. Annual analytical reports on the education system in the Czech Republic and Portugal
In the Czech Republic, the Ministry of Education, Youth and Sports produces an annual report regarding its evaluation of the country’s education system (the Status Report on the Development of the Education System in the Czech Republic). This report relies on a set of indicators designed to assess progress towards the country’s long-term policy objectives. The document summarises the main organisational and legislative changes that occurred in the given year and presents statistical indicators describing the situation and development in pre-primary, basic, secondary and tertiary education. The report also contains information about educational staff in the system, the funding of schools and the labour market situation of school leavers. These data constitute a basis for the development of education policies. Furthermore, the report typically includes an area of specific focus (e.g. the annual report from 2017 includes a section regarding the country’s results in PIRLS 2016). Individual regions within the Czech Republic also produce their own reports to assess progress towards long-term policy objectives.
In Portugal, the National Education Council, an independent advisory body to the Ministry of Education, has published the annual State of Education report since 2010, which provides an analysis of key data on the education system. The first issue, the State of Education 2010 – School Paths, offered a detailed investigation of student pathways in the education system and the latest issue, The State of Education 2017, published in 2018, contains a section dedicated to the state of education in Portugal’s “countryside” and the role of education in promoting territorial cohesion. The report also offers policy advice on how to improve the quality of pre-primary, basic, secondary and tertiary education. It also evaluates policy initiatives, such as changes to school evaluation, human and financial resources, and policies aimed at increasing educational equity.
Source: Santiago et al. (2012[31]), OECD Reviews of Evaluation and Assessment in Education: Portugal 2012, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://doi.org/10.1787/9789264117020-en; Santiago et al. (2012[32]), OECD Reviews of Evaluation and Assessment in Education: Czech Republic 2012, OECD Publishing, Paris, https://doi.org/10.1787/9789264116788-1-en; CNE (2018[33]), Estado da Educação 2017, http://www.cnedu.pt (accessed on 18 November 2019).
Embed the use of evidence in the policy-making process
Research on effective policy-making shows the importance of drawing on evidence to inform decisions both before and after implementation (OECD, 2018[12]). There are many ways in which governments use evidence to inform policies and strategic plans. In particular, randomised controlled trials (RCTs) are increasingly considered one of the best ways to determine the effectiveness of a given policy. This is because RCTs can help estimate the potential impact of a policy intervention before its introduction and measure its success ex post. Findings can then be used to adjust or inform policies, help leverage political commitments and serve as a means to promote greater collaboration among different education agencies.
In Albania, as in many OECD countries, these practices do not happen regularly and, when they do, they are often driven by external demand rather than by actors within the government. For example, the pilot study of the national assessment, the evaluation of the curriculum reform and the recent appraisal of the current education strategy were all completed with financial and/or technical support from international actors. This lack of demand for evidence from within the government contributes to, and is partly a result of, a relatively politicised legislative context and weak public administration (OECD, 2017[34]). Some ways in which Albania could strengthen the demand for evidence in education policy-making include:
Establishing a guideline that all major policy changes should first be piloted and studied rigorously before full-scale implementation. This could take the form of a Ministerial Order and be implemented by the MPS sector.
Evaluating policies systematically to determine their effectiveness and inform future reforms. For example, Albania might assess the effectiveness of the new programme to provide free transportation to students and teachers. The MPS sector could either conduct or commission such analysis.
Ensuring key actors meet frequently to discuss issues (Bryson, 2018[35]). Albania could organise regular meetings with the heads of education agencies and high-level ministry officials to discuss new research and evidence, then collectively decide what actions to take in response. The MPS sector could organise such meetings on behalf of the Minister. Committee hearings with members of the Parliament could also help embed the use of evidence in policy-making processes.
Policy issue 5.2. Modernising the education management information system
Integrated and comprehensive education management information systems are widespread in OECD and European countries. In Albania, current processes for collecting and managing information about the education sector are outdated, and information is not accessible from a unified source that supports analysis across different databases. Furthermore, only INSTAT publicly reports data in a format that users can easily download to perform their own analysis and generate new insights. Without sharper tools for managing and using data, Albania is likely to continue struggling with system evaluation.
The forthcoming Socrates platform will be the first database in Albania that combines administrative data and learning outcomes in a central source. It will also allow schools to enter data directly into the system, promising to replace the current process of gathering school-level data through emails and Excel files. This represents an important step towards modernising the collection and management of education data in Albania. However, the development and implementation of Socrates has been slow and poorly co-ordinated. To support system evaluation, Albania should make finalising the new EMIS a priority and establish it as the central source of information about the country’s education system. In working towards the development and completion of the new EMIS, Albania should also build in analytical and reporting functions to ensure that Socrates becomes an effective and useful tool for system evaluation.
Recommendation 5.2.1. Address gaps in the development of Socrates and establish it as the central source of education data
The Socrates system is an excellent opportunity for Albania to modernise the collection, management and use of education data. Nevertheless, there are important gaps in current plans for Socrates’s development, such as the lack of protocols for defining, collecting and verifying data. Moreover, Albania will need to build the staff capacity to manage the EMIS once it is finalised and establish it as the authoritative source of information for all stakeholders. These efforts are paramount to providing the quantitative information needed to improve system evaluation in Albania.
Establish protocols for data definition, collection and retrieval from schools
To streamline data collection processes in Albania, the ESC plans to train and certify 1-2 individuals within each school to enter data into the Socrates system through a password‑protected portal. It will be important that all schools are able to connect to the Internet in order to access the portal and submit data. While directly submitting data reduces the risk of introducing human error, which might occur when school-level information is aggregated manually, Albania does not have common data standards to ensure that all schools have a shared understanding of data definitions. The result is that schools might report indicators or data points in different ways. Accounting for school facilities (e.g. satellite schools) and multi-grade schools is already problematic for current data collection and could intensify when schools start entering data directly into the database. Albania also lacks clear protocols about who can request information from schools. This presents a risk that once implemented, different actors and agencies might bypass the Socrates system and continue to collect school‑level data in parallel using email and Excel files.
Many countries have established strict protocols regarding how to define data points and who can retrieve information from schools. For example, the United States uses common data standards to ensure that schools enter information directly into databases consistently (see Box 5.3). In Albania, a formal data dictionary and sharing protocol would give schools guidance on how to define data and a mandate to reject information requests made outside the Socrates system, thus encouraging both government bodies and external requesters to turn to Socrates for the information they need. These protocols could support the training sessions the ESC plans to organise for schools in preparation for the implementation of Socrates. Finally, ensuring that data definitions for Socrates are consistent with international definitions would make it easier to report data internationally.
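Purely as an illustration, a data dictionary of this kind can be made machine-readable so that school submissions are validated automatically at the point of entry. The sketch below defines two hypothetical indicator entries and checks a submission against them; all field names, categories and rules are invented for this example and are not drawn from Albania’s actual plans:

```python
# Illustrative sketch of a machine-readable data dictionary and a
# validation step; every field name, code and rule here is hypothetical.
DATA_DICTIONARY = {
    "enrolment_total": {
        "definition": "Number of students enrolled on the official count date",
        "type": int,
        "min": 0,
    },
    "school_type": {
        "definition": "Main or satellite school facility",
        "type": str,
        "allowed": {"main", "satellite"},
    },
}

def validate_submission(record: dict) -> list:
    """Return a list of problems found in a school's data submission."""
    problems = []
    for field, rules in DATA_DICTIONARY.items():
        if field not in record:
            problems.append(f"missing field: {field}")
            continue
        value = record[field]
        if not isinstance(value, rules["type"]):
            problems.append(f"{field}: expected {rules['type'].__name__}")
        elif "min" in rules and value < rules["min"]:
            problems.append(f"{field}: below minimum {rules['min']}")
        elif "allowed" in rules and value not in rules["allowed"]:
            problems.append(f"{field}: not one of {sorted(rules['allowed'])}")
    return problems

# A submission using an undefined category would be flagged for correction:
print(validate_submission({"enrolment_total": 412, "school_type": "branch"}))
```

In practice, each dictionary entry would also record the agency responsible for the field and any alignment with international definitions, so the same entries could serve both school training materials and automated checks.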
Box 5.3. Common Education Data Standards in the United States
In the United States, the Common Education Data Standards (CEDS) are part of a national initiative to develop a common language for education data across states, districts, and education organisations, allowing different actors to exchange and compare data in a more efficient and accurate way. The standards were developed in 2009 by the National Center for Education Statistics and a CEDS stakeholder group, which included representatives from different levels of government, higher education agencies, early childhood organisations, and other relevant stakeholders. Since its 6th version, the common standards have been developed and maintained online by an open community of users.
An increasing number of stakeholders have benefited from CEDS. Federal agencies, for example, have used CEDS to improve the quality and accuracy of their data and align their different data collection systems. Having common data definitions across the system has not only improved the accuracy with which information is transferred across the education sector, from early childhood to the workforce, but also across other sectors, such as social services or health at the local, state and national levels. The standards also provide educators, researchers and policymakers with a clear understanding of what the data mean so they can work together to improve programmes and outcomes for students.
Sources: CEDS (n.d.[36]), Why CEDS?, Common Education Data Standards, https://ceds.ed.gov/pdf/why-ceds.pdf (accessed on 23 October 2019); CEDS (n.d.[37]), Frequently Asked Questions, CEDS, https://ceds.ed.gov/FAQ.aspx (accessed on 23 October 2019).
Create quality assurance procedures to verify the accuracy of data entered
To ensure that data are of the highest quality, many countries implement strict data validation and auditing procedures (Abdul-Hamid, 2014[10]). While Albania’s ministry, regional directorates and local education offices seem to conduct ad hoc validations, for example by investigating discrepancies in the data they receive, these procedures are not systematised.
Establishing regular quality assurance mechanisms for EMIS data would help ensure that data protocols are effective and accurate. These could include visiting a sample of schools to check whether independently collected data align with the school’s own records, and whether those records align with the information the school enters into the EMIS (McLaughlin et al., 2017[36]). Such procedures can increase trust in the Socrates system, which is especially important in light of concerns that stakeholders may otherwise bypass the new EMIS and continue collecting and processing data using email and Excel.
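As a purely hypothetical sketch of such a verification step, the code below compares independently audited figures with EMIS entries for a random sample of schools and flags discrepancies beyond a tolerance. The school codes, figures and tolerance threshold are all invented for illustration:

```python
import random

# Hypothetical EMIS entries and independently audited figures for one
# indicator (e.g. enrolment); all numbers are invented for this sketch.
emis_data = {"SCH-001": 412, "SCH-002": 230, "SCH-003": 198, "SCH-004": 510}
audited_data = {"SCH-001": 412, "SCH-002": 251, "SCH-003": 198, "SCH-004": 509}

def audit_sample(emis, audit, sample_size, tolerance=0.02, seed=42):
    """Flag schools where the EMIS figure deviates from the audited figure
    by more than the tolerance (as a share of the audited figure)."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    sample = rng.sample(sorted(emis), k=min(sample_size, len(emis)))
    flagged = []
    for school in sample:
        reported, verified = emis[school], audit[school]
        if verified and abs(reported - verified) / verified > tolerance:
            flagged.append(school)
    return flagged

# Only SCH-002 differs materially (230 reported vs 251 audited):
print(audit_sample(emis_data, audited_data, sample_size=4))
```

A routine like this would only be the mechanical core of an audit; the substantive work is the field visit that produces the independently verified figures in the first place.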
A central government body should undertake the role of verifying the accuracy of data entered into Socrates to ensure consistency at the national level. One option could be for the independent Supreme State Audit (Kontrolli i Lartë i Shtetit) office to take on this role since it has a broad mandate to promote accountability across the public sector (Albania Supreme State Audit, n.d.[37]). Another would be to create a small team within the new Quality Assurance Agency, with a specific mandate for the quality assurance of EMIS data.
Raise the prominence of Socrates by positioning it closer to central leadership
Socrates is currently being developed by the ESC, which has better infrastructure and greater technical capacity than the central ministry to develop a modern EMIS. However, the ministry’s MPS sector, which manages official administrative data and prepares the annual education statistical report, has not been involved in designing Socrates. Developing such a critical tool in isolation from central leadership increases the risk that Socrates will be under-utilised and that different actors will continue to collect and manage their own data. Once the ESC has overseen the full development and introduction of Socrates, it should gradually prepare the ministry to take ownership of the platform.
Many education systems place responsibility for collecting, processing and disseminating education statistics within the central ministry. For example, the Romanian education ministry has a special unit within its Public Policy Department that is responsible for managing the country’s Integrated Information System of Education. Serbia’s Unified Information System of Education is also housed in the central ministry. Albania’s MPS sector would be a logical place to house Socrates in the long term, reinforcing its mandate to monitor the education system. The MPS sector’s location within the ministry would also raise the prominence of this tool and could facilitate a greater demand for evidence in national education debates, ensuring that Socrates develops into a responsive tool that meets the data needs of policy makers. Importantly, the responsibility for managing Socrates should be clearly defined and communicated to all agencies and actors who will feed data into the system.
Build staff capacity to implement Socrates
The MPS sector will need increased technical staff capacity if it is to become responsible for the management and future development of Socrates. Currently, the Albanian MPS sector manages official administrative data with support from three individuals, only two of whom are responsible for statistics. This makes it difficult for the sector to collaborate with the ESC to help finalise the development of Socrates, let alone take responsibility for establishing the platform as an effective tool for system evaluation. To make Socrates the central source of education data, Albania will need to build the capacity of technical staff and key actors across the system, in particular within the ministry. Primarily, this involves having enough people with the right skills to manage and use a fully functioning EMIS. For example, the EMIS in Georgia employs five statisticians solely for responding to data and research requests, in addition to department leadership, administrative support and software developers who manage the system (Li et al., 2019[38]).
Albania would benefit from employing additional staff within the ministry’s MPS sector who could help liaise with the ESC and Bit Media e-Learning Solutions (the company developing the EMIS infrastructure). These individuals could also help make small fixes once the system is implemented. Specific capacities that need to be recruited or developed include software development for maintaining and improving the EMIS and quantitative analysis skills for processing data and creating thematic reports. Staff should also have training opportunities to develop their technical skills and keep up-to-date with changes in the EMIS, user needs and changing technologies (Abdul-Hamid, 2014[10]).
Recommendation 5.2.2. Develop Socrates into a functional tool to inform decision-making
Effective education management information systems incorporate features that allow for strong analysis and reporting, which can aid research and inform policy-making (Villanueva, 2003[39]). While current plans for developing Socrates include some important innovations, in particular the creation of a unique student identifier that will help link databases and facilitate multi-dimensional analysis, more could be done to support dynamic analysis and generate reports that inform education policy.
For example, Albania could build different interfaces into the design of the Socrates system to better support analytical and reporting functions that can be used by policy makers, researchers and schools. Using a national identification number could also enhance Socrates’s functionality as a system evaluation tool.
Create a user-friendly interface to make education data easily accessible
With the exception of a limited amount of information available through the INSTAT data portal, Albania has no user-friendly interface that allows individuals within and outside the government to explore different sources of information about the education system. Instead, users can access national education data by written request to the ministry or the relevant technical agency, or by reviewing regular publications, such as the annual statistics yearbook on education, sports and youth. However, these publications only provide data in PDF tables and there is no way to link various sources of information together. This situation makes it difficult to raise the demand for data among different users and to conduct independent system analysis. Albania should modernise the way in which it disseminates information about the education system to different audiences.
Real-time access to data through a web portal is an increasingly common way to extract information from EMIS databases and present it in an accessible manner (Abdul-Hamid, 2017[40]). For example, the ministry should create a public dashboard containing data about the performance of the education system for general users. Within the government, a private dashboard with school-level information, such as enrolment numbers and performance on national assessments, could also be developed for the central ministry, technical agencies, regional directorates, local education offices and schools. The school dashboard should be an internal tool that replaces the public school performance card (see Chapter 4), which would remove the risk of unfairly ranking schools while still providing the information needed for planning purposes and to identify and disseminate good practices. Both dashboards should automatically populate information from data stored in Socrates and facilitate customised comparisons in a contextualised manner and at different aggregate levels (e.g. regional or national). The dashboards could also include data visualisation features that allow users to generate charts and figures or export data for further analysis.
Consider using a national identification number to link data across agencies
One of the most noteworthy innovations Socrates will introduce is a unique identifier that will integrate various education databases. Using unique identifiers will enhance the analytical functions of education data in Albania and provide insights to support national education goals. However, there are a few design elements the ministry should reconsider as it finalises the development of Socrates. In particular, Albania should use a national identification number rather than creating a new one just for the education system. There are several advantages to this approach. First, a civil identification number will follow a standard structure across all education databases, including vocational education and training, higher education, etc. Moreover, because it exists nationally, a civil identification number can be used to conduct research across different sectors (e.g. if one wishes to study education outcomes and labour market success). Finally, by using a civil identification number, much student information can be retrieved automatically by linking the EMIS with the national registry. This greatly improves data quality and reduces the data entry burden on schools. Of course, using civil identification numbers requires protocols about who can access data, how they can access it and when data should be anonymised to protect student privacy.
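To illustrate the kind of cross-sector linkage a shared national identifier makes possible, the sketch below joins invented education and employment records on a national ID and replaces the raw identifier with a salted hash before analysis. All records, field names and the salt value are hypothetical:

```python
import hashlib

# Invented records keyed by a national identification number; in practice
# these would come from the EMIS and a labour-market register respectively.
education = [
    {"national_id": "AL1001", "upper_secondary_completed": True},
    {"national_id": "AL1002", "upper_secondary_completed": False},
]
employment = [
    {"national_id": "AL1001", "employed": True},
    {"national_id": "AL1002", "employed": False},
]

def pseudonymise(national_id: str, salt: str = "research-2020") -> str:
    """Replace the raw ID with a salted hash so analysts never see it."""
    return hashlib.sha256((salt + national_id).encode()).hexdigest()[:12]

def link_records(edu, emp):
    """Join the two datasets on the national ID, keeping only the hash."""
    emp_by_id = {r["national_id"]: r for r in emp}
    linked = []
    for r in edu:
        match = emp_by_id.get(r["national_id"])
        if match:
            linked.append({
                "person": pseudonymise(r["national_id"]),
                "upper_secondary_completed": r["upper_secondary_completed"],
                "employed": match["employed"],
            })
    return linked

for row in link_records(education, employment):
    print(row)
```

The design choice sketched here, pseudonymising at the point of linkage, would let analysts study relationships (for example between completing upper secondary education and employment) without ever handling the raw national ID, which is one way the access and anonymisation protocols mentioned above could be operationalised.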
Box 5.4. The Estonian Education Information System
Estonia is known for having a sophisticated digital civil registration system. Most of the country’s public services are available online and even voting can be done through a secure digital identification process that is available for all citizens via their national identity card, a mandatory document that establishes a person’s identity. This personal identification system is also used in the education sector. As a result, the web-based Estonian Education Information System (EHIS) is able to link all education databases with each other and with over 20 different information systems in the country, such as the Population Register (used for example, to calculate the number of out-of-school children) and the Estonian Examination Information System.
The EHIS follows clear guidelines about how information can be accessed and presented, which helps protect personal and statistical data from being misused. In particular, access to the EHIS requires registered authorisation. Only individuals performing a duty prescribed by law and which requires information from the database are able to access personal information about students, teachers and school staff. To obtain approval, individuals must submit a written application to the Ministry of Education, setting out what data they require and how they intend to use it. These features enable the EHIS to serve as an important tool for monitoring and guiding policies in Estonia’s education system.
Sources: e-Estonia (n.d.[41]), Education, e-Estonia, https://e-estonia.com/solutions/education/ (accessed on 5 November 2019); EESTi (n.d.[42]), Education and Science, EESTi Gateway to e-Estonia, https://www.eesti.ee/eng/services/citizen/haridus_ja_teadus/isikukaart_eesti_ee_portaali (accessed on 5 November 2019); The European Agency for Special Needs and Inclusive Education (2019[43]), Estonia Background Information, https://www.european-agency.org/data/estonia/background-info (accessed on 18 November 2019); Lao-Peetersoo (2014[44]), Introduction of Estonian Education Information System (EHIS), http://www.oecd.org/education/ceri/Birgit%20Lao-Peetersoo_Introduction%20to%20the%20Estonian%20Education%20Information%20System%20EHIS.pdf (accessed on 18 November 2019); Abdul-Hamid (2017[40]), Data for Learning: Building a Smart Education Data System, https://doi.org/10.1596/978-1-4648-1099-2.
Recommendation 5.2.3. Develop the national indicator framework to guide the development of Socrates
Albania’s current education strategy includes a national indicator framework that identifies data sources related to pre-tertiary education. However, there are currently no indicators related to student learning and some indicators are not clearly defined. Moreover, targets are not systematically aligned between the main text of the strategy and the indicator framework. For example, having “95% of three- to five‑year‑olds in pre‑school education” is a clear and measurable target referenced in the main text of the strategy but is not included in the indicator framework. Albania needs to develop a national indicator framework that aligns with the education strategy and draws on information from across the system, especially student learning outcomes. Mapping the indicator framework against available sources of information can help identify information gaps and signal a need for Socrates to improve data collection in order to better measure progress. This can also help improve accountability for system performance and co-ordinate policy efforts.
Introduce indicators and targets that focus on student learning
The current strategy’s indicator framework focuses narrowly on inputs and outputs. For example, under the “quality and inclusive learning” objective, indicators include the number of schools equipped with ICT labs and the number of students who leave school early, but there are no references to learning outcomes. Considering the large share of students in Albania who do not master basic competencies, an outcome-based indicator would better measure this objective. While input and output indicators are appropriate measures for some parts of the indicator framework, this review strongly recommends that Albania establish precise outcome indicators on student learning and associate these with achievable targets. This can help ensure that different stakeholders recognise learning as a national priority and keep policy makers accountable for improving student outcomes. The ministry could, for example, consider introducing targets on:
Reducing the share of students not reaching minimum competency levels in numeracy and literacy. In the short term, since Albania’s national assessment does not enable reliable system-level comparisons, data from international assessments, such as TIMSS, PIRLS and PISA, can serve as indicators to monitor student performance. For example, setting a target to reduce the share of low performers (below PISA Level 2) to less than 15% by 2030 would align with the EU and UN focus on ensuring all students acquire basic skills (European Commission, 2018[45]). In the medium term, the VANAF should also be used to track this target.
Reducing the gap in student learning between population sub-groups. Considering the evidence of educational inequalities across different groups of students (see Chapter 1), it is promising that Albania disaggregates enrolment rates by gender, disability and ethnic minority population (Roma and Balkan Egyptian). However, the national indicator framework should include information on student performance with the goal of reducing achievement gaps among student groups and between rural and urban areas. This information could be collected through the background questionnaires of Albania’s national assessments and exams.
These indicators and targets require reliable national and international assessment tools to track progress over time. Albania should strengthen its national assessment (see Policy issue 5.3) and data collection tools to capture a wider range of contextual information about student background, such as socio-economic status, national minority group (e.g. ethnic minority populations, non-native Albanian speakers) and special education needs. This would support in-depth analysis and strategic planning to address equity issues.
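The calculation behind such a gap indicator is straightforward and can be sketched in a few lines of Python. The records, field names and scores below are purely illustrative and do not correspond to actual VANAF or Socrates data:

```python
# Illustrative sketch: computing an achievement gap between student
# sub-groups from assessment microdata. All field names and values
# are hypothetical.
from statistics import mean

records = [
    {"score": 480, "area": "urban", "gender": "F"},
    {"score": 455, "area": "urban", "gender": "M"},
    {"score": 410, "area": "rural", "gender": "F"},
    {"score": 395, "area": "rural", "gender": "M"},
]

def gap(records, field, group_a, group_b):
    """Mean-score gap between two sub-groups on a background variable."""
    mean_a = mean(r["score"] for r in records if r[field] == group_a)
    mean_b = mean(r["score"] for r in records if r[field] == group_b)
    return mean_a - mean_b

urban_rural_gap = gap(records, "area", "urban", "rural")
print(urban_rural_gap)  # 65.0 with this toy data
```

A target could then be expressed as reducing such a gap below a threshold by a given year, provided the underlying assessment data are reliable and comparable.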
Use the national indicator framework to prioritise data collection for Socrates
In addition to monitoring student learning outcomes, the ministry could develop a stable national indicator framework to identify data gaps and help orient the future development of Socrates. If, for example, the ministry sets a system goal to improve the retention of students from ethnic minority groups, the national indicator framework would indicate that data on student ethnicity need to be collected and added to Socrates. If Socrates currently does not hold such data, or if such data is poorly collected, EMIS staff would prioritise developing capacity and data collection procedures to support the monitoring of this indicator. The indicator framework could also incorporate data related to student achievement, the teaching profession and school performance, which can provide valuable insights for system evaluation. Reporting against indicators from the framework in an education report would also support public accountability and create pressure to ensure that any data gaps are addressed.
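The mapping exercise described above amounts to checking each indicator against an identified data source and flagging those without one. A minimal sketch, with entirely hypothetical indicator names and sources:

```python
# Hypothetical sketch: mapping strategy indicators to data sources to
# surface gaps that Socrates would need to fill. The indicator names
# and sources are illustrative, not the official framework.
indicators = {
    "share of low performers in reading": "PISA",
    "pre-school enrolment of 3-5 year-olds": "Socrates",
    "retention of ethnic minority students": None,  # no source yet
}

missing = [name for name, source in indicators.items() if source is None]
for name in missing:
    print(f"Data gap: '{name}' has no source - prioritise for Socrates")
```

In practice this mapping would be maintained as part of the indicator framework itself, so that each revision of the strategy makes remaining data gaps visible.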
Policy issue 5.3. Ensuring the national assessment supports system goals
Albania introduced the VANAF national assessment to support system monitoring, implement the new curriculum and help improve student learning. However, the lack of standardised marking and moderation processes means the results are not comparable at the system level. As a result, Albania currently does not have a reliable external measure of learning outcomes until students take the PISA assessment at age 15. While data from TIMSS and PIRLS will soon be available to help monitor learning outcomes in earlier grades, international assessments cannot measure how well students are meeting the national curriculum standards. This information is not available until students take the State Matura Examination in Grade 12. As a priority, Albania should align the design and implementation of its national assessment system in order to better support system goals.
Once a reliable assessment instrument has been established, the ministry will need to work with the ESC to improve the way in which assessment results are disseminated. This is especially important because Albania’s national assessment is census-based. While census assessments are more expensive to implement than sample-based ones, research shows that having an externally validated measure of learning for each student can help identify and address achievement gaps and act as a reference for teachers’ classroom marking (OECD, 2013[1]). However, Albania’s current practice of producing a single national report that ranks schools based on their aggregate results does not support these broader functions. Improving the dissemination of national assessment results by making relevant and contextualised comparisons and creating reporting structures that target the different interests of students, parents, teachers and schools can help Albania leverage the potential of this important evaluation tool.
Recommendation 5.3.1. Align the national assessment with its stated purpose of system monitoring
Albania has clearly defined the purpose of each assessment that students take along their school trajectory. The stated purpose of the VANAF is to help upgrade the skills, knowledge and know-how of students; monitor and control the implementation of the curriculum; and inform students, parents and educational institutions about student achievements (MoESY, 2018[13]). While these objectives are in line with the purpose of national assessments in many OECD and EU countries, the current design and marking of the VANAF do not support such a broad purpose. In particular, the lack of standardised marking and moderation processes means the results cannot be compared nationally. This undermines the VANAF’s reliability as a system-monitoring tool. Moreover, the VANAF has potential as a census-based assessment to serve a formative function by helping schools and teachers to identify and support struggling students. However, the decision not to implement the national assessment in Grade 3 represents a missed opportunity to identify and address achievement gaps earlier, before they become problematic. This section discusses some of the changes that Albania could consider to more closely align the VANAF with its joint purpose of serving as a system-monitoring tool and a formative resource for teachers and schools. The analysis is guided by a set of key considerations, outlined in Table 5.5, which any country needs to review when determining the design of a national assessment.
Table 5.5. Key decisions regarding national assessments
| Topic | Options | Advantages | Disadvantages |
|---|---|---|---|
| Subjects | Many | Broader coverage of skills assessed | More expensive to develop; not all students might be prepared to take all subjects |
| | Few | Cheaper to develop; subjects are generalisable to a larger student population | More limited coverage of skills assessed |
| Target population | Sample | Cheaper and faster to implement | Results can only be produced at high, aggregate levels |
| | Census | Results can be produced for individual students and schools | More expensive and slower to implement |
| Grade level | Lower | Skills can be diagnosed and improved at an early stage of education | The length of the assessment and the types of questions that can be asked are limited |
| | Upper | More flexibility with respect to the length of the assessment and the types of questions that are asked | Skills cannot be evaluated until students are in later stages of education |
| Scoring type | Criterion-referenced | Results are comparable across different administrations | Results require expertise to scale and are difficult to interpret |
| | Norm-referenced | Results are easier to scale and interpret | Results are only comparable within one administration of the assessment |
| Item type | Closed-ended | Cheaper and faster to implement; items are more accurately marked | Can only measure a limited number of skills |
| | Open-ended | A broader set of skills can be measured | More expensive and slower to implement; marking is more subjective in nature |
| Testing mode | Paper | The processes are already in place and the country is familiar with them; requires no additional capital investment | Results are produced more slowly; seen as more old-fashioned |
| | Computer | Results are produced more quickly; more cost-effective in the long term; seen as more modern | New processes have to be developed and communicated; requires significant initial capital investment |
Source: Adapted from DFID (2011[46]), National and international assessment of student achievement: a DFID practice paper, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/67619/nat-int-assess-stdnt-ach.pdf (accessed on 13 July 2018); OECD (2011[47]), Education at a Glance 2011: OECD indicators, https://doi.org/10.1787/eag-2011-en.
Review the target population and grade levels
Considering Albania’s national assessment has stated purposes to support system monitoring and improve the “skills, knowledge and know-how of students,” this review recommends that the ministry reconsider its decision not to pursue an assessment of student learning in an earlier grade. This is because the consolidation of a child’s cognitive skills in the early years of primary school is essential for their future learning. For this reason, many OECD countries assess student learning in at least one grade of primary school to identify issues in students’ learning before they become problematic and track progress over time. Waiting until the end of primary school (in Grade 5, when students are around 10 or 11 years old) to reliably collect external data about student learning is too late if the instrument intends to serve a formative purpose by helping teachers and schools identify and address learning gaps. This review recommends the Albanian ministry maintain the VANAF in Grade 5 but also introduce a census-based national assessment in Grade 3. This configuration of Albania’s assessment framework would provide information on student learning to support system monitoring across different stages of the pre-tertiary education system (see Table 5.6).
Introduce a census-based national assessment in Grade 3. A full-cohort national assessment in Grade 3 would provide valuable insights about student learning at a stage of education where there is currently no comparable information about the extent to which students in Albania are meeting curriculum standards. A census assessment in this grade would give students time to adjust to formal schooling but still give schools and teachers a chance to address achievement gaps early on. The design of the Grade 3 assessment should be appropriate for young learners and could incorporate test items that link the assessment to the Grade 5 level. This type of vertical linking or scaling would show student progress over time against a common measurement scale across the third and fifth grades.
Maintain a census-based national assessment in Grade 5. This review supports maintaining the VANAF as a census-based national assessment in Grade 5 since this marks the end of primary school and provides data to help better understand the quality of learning that takes place during this stage of the education cycle.
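The vertical linking mentioned for the Grade 3 assessment can be illustrated with a deliberately simplified common-item sketch: scores from the lower grade are placed on the upper grade's scale by adjusting for the mean difference on items administered in both grades. Operational programmes would use item response theory (IRT) scaling rather than this mean shift, and all numbers below are invented:

```python
# Toy illustration of vertical linking via common items: express
# Grade 3 scores on the Grade 5 scale by shifting for the mean
# difference on items taken by both grades. Real programmes use IRT
# scaling; these values are invented.
from statistics import mean

grade3_common = [12.0, 14.0, 13.0]  # Grade 3 mean scores on shared items
grade5_common = [16.0, 18.0, 17.0]  # Grade 5 mean scores on the same items

shift = mean(grade5_common) - mean(grade3_common)

def to_grade5_scale(grade3_score):
    """Express a Grade 3 score on the common Grade 5 scale."""
    return grade3_score + shift
```

With a common scale of this kind, progress between Grades 3 and 5 can be reported as growth rather than as two unrelated results.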
While census-based assessments can serve as accountability measures, it is important to communicate to parents, teachers and other actors that Albania’s national assessment is for system monitoring and formative purposes only. In neighbouring North Macedonia, for example, the government had to eliminate its national assessment in 2017 because it was being used unfairly to determine teacher salaries based on how closely classroom marks corresponded to students’ national assessment results (OECD, 2019[48]). While the VANAF can serve a formative function by giving teachers an external reference point to help moderate or benchmark their classroom assessment, attaching high stakes to the results should be avoided. This is a concern in Albania since the ESC, regional directorates and local education offices rank schools according to average scores. Improving the way in which results are disseminated and used can help reduce the risk of negative consequences for students, teachers and schools (see Recommendation 5.3.2).
Table 5.6. Proposal for national assessment framework in Albania
| Grades | Assessment | Frequency | Population | Subjects | Primary purposes |
|---|---|---|---|---|---|
| Grade 3 | National assessment | 1 or 2 year cycle | Census | Mathematics and Albanian language | System monitoring and formative |
| Grade 4 | TIMSS/PIRLS (international assessments)* | 4/5 year cycle* | Sample | Mathematics, science and reading | System monitoring |
| Grade 5 | National assessment (VANAF) | Annual | Census | Mathematics, science and Albanian language | System monitoring and formative |
| Grade 9 | National Basic Education Examination | Annual | Census | Mathematics, Albanian language (or mother tongue) and foreign language** | Student certification |
| Grade 9 | PISA (international assessment) | 3 year cycle | Sample | Mathematics, science and reading | System monitoring |
| Grade 12 (depends on programme) | State Matura Examination | Annual | Census | Mathematics, Albanian language and literature, foreign language and one elective (see Chapter 2) | Student selection and certification |
Note: This table is based on recommendations from across this review. It aggregates proposed, current and planned sources of information on student learning that can be used for system monitoring and formative use in schools.
*Albania will participate in TIMSS and PIRLS for the first time in 2019 and 2021, respectively.
**National minority language students are assessed in mother tongue, Albanian language and mathematics on the National Basic Education Examination. They also have an option to take a foreign language subject.
Source: Authors; MoESY (2018[13]), Country Background Report: Albania, Ministry of Education, Sports and Youth, Tirana.
Standardise marking of the national assessments
Reliability is an essential feature of assessments used to monitor learning outcomes at the system level and over time. Highly reliable assessments ensure that particular assessors and marking procedures do not influence the results of the test (OECD, 2013[1]). In Albania, the content of the VANAF is standardised but the marking does not meet high‑quality standards. Schools are responsible for administering the national assessment but marking is done by local education offices, which often struggle to attract the most experienced teachers to do this work. The main reasons for this are that teachers do not receive remuneration for marking the national assessment and this responsibility is not recognised within the teaching career structure. There are also no moderation processes to validate the consistency of marking. As a result, the VANAF does not provide accurate information that can highlight differences in student learning across the country.
This review strongly recommends that Albania introduce the same rigorous external marking or moderation procedures for its national assessment as is standard practice in most OECD countries. Concretely, this means transferring the responsibility of marking the VANAF Grade 5 and the future Grade 3 assessment to the ESC, which is investing in the infrastructure to mark State Matura exams electronically. This technology could also be used to mark the national assessments. For questions that cannot be evaluated using technology, assessors should be selected and certified according to strict criteria and their marking moderated (e.g. joint marking, sampled second marking). The ministry should consider introducing incentives for individuals to take on these additional responsibilities, which could help ensure that both the delivery and marking of these tests meet high‑quality standards. Such changes will require additional resources and capacity for the ESC.
Consider electronic marking and moving to computer-based delivery in the future
The use of computers to administer national assessments is becoming more common, particularly in countries that introduced a national assessment recently (OECD, 2013[1]). Compared to paper-based delivery, computer-based testing has several advantages. It tends to be cheaper to administer (aside from the initial capital investment), delivers results more quickly and is less prone to human error and integrity breaches. Albania has begun marking the State Matura exam electronically and should use this technology to mark national assessments as well, which would help standardise the marking process. As the country endeavours to modernise its education system, the ministry and the ESC might also consider delivering national assessments using computers in the future. However, this will require significant financial investments and should be seen as a long-term goal. Albania’s experience administering the PISA 2018 test via computers was a major challenge, and the ESC had to loan computers to schools in order for students to take the test. PISA is a sample-based assessment so the prospect of delivering the VANAF or Grade 3 assessment to a full cohort of students seems unrealistic in the immediate future.
As such, Albania should focus on improving and establishing reliable paper-based assessment instruments in Grades 3 and 5 that can be marked electronically using a combination of automatic marking and human on-screen marking to score tests. Then, when resources allow, a digital assessment that mimics the paper version could be established. Research should be conducted to compare results between the two delivery methods at this stage. Finally, the ministry and the ESC should raise awareness to prepare schools and students for implementing an entirely digital national assessment before the instrument is fully introduced.
Maintain focus on foundation skills but consider adding items to the Grade 5 subject tests
Focusing the national assessment on a limited number of subjects can generate data to help strengthen the foundational skills of students. Albania’s current national assessment in Grade 5 tests students in mathematics, Albanian language and science using an integrated test of 50 questions to cover the curriculum (only 10 of which are science questions). The total score of the integrated test may provide a sufficiently reliable estimate of student achievement if performance on different subject tests is positively correlated. However, the current number of questions for each subject measured by the VANAF is unlikely to fully measure all domains of the Grade 5 mathematics, Albanian language and science curriculum. As such, Albania should consider extending the number of items for each subject test in the Grade 5 national assessment to help collect more reliable information about how well students meet curriculum standards. Another way to increase the curriculum coverage of the national assessment is to use a matrix sampling method, whereby different content is included in different sets of test booklets (OECD, 2013[1]). This approach allows for broader coverage of the curriculum without increasing testing time but would undermine the diagnostic value of the census-based instrument since students would not be tested in the same material.
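The matrix sampling method referred to above works by rotating blocks of items across booklets so that, collectively, the booklets cover the full curriculum while no individual student sits a longer test. A minimal sketch, with hypothetical block labels:

```python
# Illustrative matrix-sampling sketch: rotate item blocks across test
# booklets so the full curriculum is covered without lengthening any
# single booklet. Block labels are hypothetical.
blocks = ["numeracy-A", "numeracy-B", "literacy-A", "literacy-B", "science-A"]

def build_booklets(blocks, blocks_per_booklet=2):
    """Each booklet takes consecutive blocks, wrapping around the pool,
    so every block appears in the same number of booklets."""
    booklets = []
    for start in range(len(blocks)):
        booklet = [blocks[(start + i) % len(blocks)]
                   for i in range(blocks_per_booklet)]
        booklets.append(booklet)
    return booklets

booklets = build_booklets(blocks)
```

The balanced rotation is what allows system-level estimates for every curriculum domain, even though each student sees only a subset, which is also why individual diagnostic value is reduced.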
The subjects assessed in the Grade 3 assessment should also focus on foundational skills, namely numeracy and literacy. There is no need for a science assessment at this stage because the students are very young and will have limited subject knowledge.
Set a realistic timetable to introduce national assessments
This review recommends that Albania work toward establishing a census-based assessment in Grades 3 and 5 on an annual basis. While this may not be immediately feasible considering the significant costs and staff capacity required to conduct annual census-based assessments, Albania already conducts the VANAF Grade 5 assessment annually and it is unclear why the ministry decided to forgo the assessment in Grade 3. Albania should use results from the 2014-15 pilot study to inform a realistic timetable for introducing the Grade 3 assessment. The implementation plan should allow adequate time to prepare education actors and the public for the new assessment, which can help avoid the perception that it will have stakes for students, teachers and schools. To reduce the pressure on resources and staff capacity in the short term, Albania could gradually introduce the Grade 3 assessment on a rotating or alternating basis with Grade 5. Many countries use this approach to implement their national assessments (OECD, 2013[1]). However, this should be part of a longer term system monitoring goal to generate reliable and timely information about learning outcomes in both grades on an annual basis.
Collect more contextual information about factors that impact student learning
Albania participates in international assessments that include background questionnaires for teachers, school principals, students and parents. However, the set of proxies used to capture factors that influence student learning in national assessments is limited. The VANAF currently has a 12-question student background questionnaire that allows data to be disaggregated by gender, type of school (public/private) and geographic area (urban/rural), but there are no proxies for socio-economic status. This review recommends that Albania review the structure and content of the VANAF background questionnaire to ensure it responds to research questions that can help inform education policy. For example, considering Albania’s education strategy aims to improve inclusion, the country could revise its own background questionnaires to collect more robust information on student background, such as parental education level (a proxy for socio-economic status), special education needs, and national minority group (e.g. ethnic minority populations, non-native Albanian speakers). Teacher and school questionnaires could also provide insights on some of the contextual factors that influence student learning, such as whether results represent a satellite or multi-grade school. Albania could use questionnaires from TIMSS and PISA as models to adapt and enhance its own background questionnaires.
Recommendation 5.3.2. Improve the dissemination of national assessment results to support system goals
Albania chose to develop a census-based national assessment in order to monitor the learning of all students. Considering the resource demands related to this type of assessment and in order to ensure that results can help improve teaching and learning, it is critical to optimise the national assessment for interested parties by communicating findings in an appropriate form (Kellaghan, Greaney and Murray, 2009[49]). While strengthening the reliability and design of the VANAF should be Albania’s top priority, the ministry and ESC should also consider how to most effectively report the results from the national assessments in Grades 3 and 5. This must be done with caution to avoid potentially negative consequences, such as attaching stakes to the assessment.
In Albania, the ESC currently prepares an annual national report on VANAF results. This includes several components that are essential for disseminating the findings of national assessments. Namely, the report sets the context of the assessment by highlighting its relevance for policy objectives and the framework for its design and methodology. Next, it provides a description of achievement results, trend data and correlations by gender, school type and geographic location. However, the report also ranks schools according to aggregate student scores without any contextualised information, which is unfair and unreliable since the assessment is not marked in a way that allows for comparisons across the country. Importantly, this is the only tool used to communicate VANAF results with the public and data is only accessible in PDF tables. To optimise the potential of the VANAF, the ministry and the ESC should create tailored reports that target different audiences, such as parents, teachers, schools and the general public. This could leverage the assessment’s potential to inform education policy and help achieve national learning goals. However, the ministry and the ESC should avoid using decontextualised results as an accountability measure, as this could have negative consequences.
Keep the national report but identify different benchmarks for comparisons
Census-based testing generates data that allows schools to compare their average performance with other schools. This level of comparison might not be the most relevant since this approach often results in schools with the greatest concentration of students from more advantaged backgrounds continually being considered the most effective. It also undermines the potential formative function of these assessments. Instead of limiting the unit of analysis to individual schools, several different benchmarks can be identified against which schools can compare themselves more meaningfully (Kellaghan, Greaney and Murray, 2009[49]). For example, it would be more appropriate to compare a school’s national assessment results to other schools that are located in the same regional directorate (see Recommendation 5.1.2), or have similar student populations (i.e. students with similar socio-economic backgrounds) or structures (i.e. comparing multi-grade schools with each other). Aggregate averages of schools from these categories can allow individual schools to measure themselves against more relevant performance benchmarks.
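The benchmark comparisons described above reduce to averaging school results within contextual groups rather than producing a single national ranking. A minimal sketch, with invented school data and a hypothetical grouping variable:

```python
# Sketch of contextual benchmarks: average school results within
# comparison groups (e.g. multi-grade vs regular schools) instead of
# one national ranking. School names and scores are invented.
from collections import defaultdict
from statistics import mean

schools = [
    {"name": "School A", "type": "multi-grade", "score": 52.0},
    {"name": "School B", "type": "multi-grade", "score": 48.0},
    {"name": "School C", "type": "regular",     "score": 61.0},
    {"name": "School D", "type": "regular",     "score": 57.0},
]

def benchmarks(schools, key="type"):
    """Mean score per contextual group, a fairer reference than a
    single decontextualised ranking."""
    groups = defaultdict(list)
    for s in schools:
        groups[s[key]].append(s["score"])
    return {g: mean(v) for g, v in groups.items()}

print(benchmarks(schools))  # {'multi-grade': 50.0, 'regular': 59.0}
```

The same grouping logic could use regional directorate or a socio-economic index as the key, so each school measures itself against peers operating in comparable conditions.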
Based on the body of international evidence and the education environment in Albania, this review recommends that schools are no longer ranked solely according to their average results in the VANAF report. Instead, information about school-level national assessment results should be presented alongside contextualised information about the school and student population. This contextualised information can be included in the national report but also in the school performance dashboard of Socrates (see Recommendation 5.2.2).
Create reporting structures that maximise the formative value of national assessment
Census-based testing enables the ESC to generate reports at several different levels of the education system (OECD, 2013[1]). In addition to the national report of assessment results for system monitoring, the ESC could create different reports for diverse audiences to align Albania’s national assessment with its stated purpose of informing a range of stakeholders about student achievement. Developing and disseminating these reports will require additional resources and capacity so central planning should budget this work accordingly.
Student reports. These should compare a student’s performance to national, municipal and other relevant benchmarks. Since one of the main purposes of the current assessments is to provide reliable feedback about student learning to schools and teachers, care should be taken concerning how results are provided to students and parents to avoid the perception that the results carry stakes.
Students and parents should be informed about individual student results as part of the regular parent-teacher meetings. Teachers might be provided national guidance on how to present the results. For example, teachers might discuss the results within broad categories of meeting or not meeting national expectations, rather than focusing solely on specific scores.
Reports for teachers. These should contain item-level analysis with information about how their students performed on each item and the competencies those items assessed. This information should be presented alongside contextualised comparison groups, such as gender and municipalities. To further support the assessment’s formative function, the results might also analyse common errors that students made, with suggestions on how to improve teaching of that content.
School-level reports. These should present the performance of an individual school with relevant benchmarks for comparisons. For example, a school report might compare performance to the national and regional averages or with schools operating in similar contexts. Importantly, care should be taken to ensure that school-level performance on the national assessment supports information transparency and does not become a narrow accountability measure.
Table of recommendations

Policy issue 5.1. Establishing the processes and capacity needed to conduct system evaluation

Recommendation 5.1.1. Integrate evaluation processes into the future strategy
Action items:
- Prioritise a select number of challenges and goals to focus the education strategy
- Clearly present evidence to justify the education system's top challenges
- Include precise targets in the indicator framework
- Develop more detailed implementation plans

Recommendation 5.1.2. Develop the capacity to conduct system evaluation
Action items:
- Co-ordinate monitoring and evaluation responsibilities at the central level
- Ensure evaluation bodies have the resources needed to achieve their mandates
- Strengthen regional capacity for evaluation and improve accountability for educational quality

Recommendation 5.1.3. Report on the quality of education regularly and promote the use of evidence to inform policy-making
Action items:
- Regularly publish an analytical report about the education system
- Embed the use of evidence in the policy-making process

Policy issue 5.2. Modernising the education management information system

Recommendation 5.2.1. Address gaps in the development of Socrates and establish it as the central source of education data
Action items:
- Establish protocols for data definition, collection and retrieval from schools
- Create quality assurance procedures to verify the accuracy of data entered
- Raise the prominence of Socrates by positioning it closer to central leadership
- Build staff capacity to implement Socrates

Recommendation 5.2.2. Develop Socrates into a functional tool to inform decision-making
Action items:
- Create a user-friendly interface to make education data easily accessible
- Consider using a national identification number to link data across agencies

Recommendation 5.2.3. Develop the national indicator framework to guide the development of Socrates
Action items:
- Introduce indicators and targets that focus on student learning
- Use the national indicator framework to prioritise data collection for Socrates

Policy issue 5.3. Ensuring the national assessment supports system goals

Recommendation 5.3.1. Align the national assessment with its stated purpose of system monitoring
Action items:
- Review the target population and grade levels
- Standardise marking of the national assessments
- Consider electronic marking and moving to computer-based delivery in the future
- Maintain focus on foundation skills but consider adding items to the Grade 5 subject tests
- Set a realistic timetable to introduce national assessments
- Collect more contextual information about factors that impact student learning

Recommendation 5.3.2. Improve the dissemination of national assessment results to support system goals
Action items:
- Keep the national report but identify different benchmarks for comparisons
- Create reporting structures that maximise the formative value of the national assessment
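One recommendation above, linking data across agencies through a national identification number, can be sketched in a few lines. This is an illustrative toy example only: the source names, identifiers and fields are hypothetical, and a real EMIS would enforce access controls and personal-data protections around any such join.

```python
# Three hypothetical agency databases keyed by a shared national ID.
enrolment = {"AL001": {"school": "Shkolla A", "grade": 5}}
assessment = {"AL001": {"grade5_score": 62}}
social_registry = {"AL001": {"economic_aid": True}}

def linked_record(national_id):
    """Join the three sources on the national ID; a missing source adds nothing."""
    record = {"national_id": national_id}
    for source in (enrolment, assessment, social_registry):
        record.update(source.get(national_id, {}))
    return record

# A single linked view of the student, assembled from all three agencies.
student = linked_record("AL001")
```

The value of the common identifier is that each agency keeps maintaining only its own data, while analysts can still assemble a complete picture of a student, for instance relating assessment results to socio-economic background.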