Engaging Employers in Vocational Education and Training in Brazil

5. Assessment and certification in vocational education and training

Abstract

This chapter looks at international experience with vocational education and training (VET) assessment and certification. It discusses the need for standardisation and independence, as well as the importance of ‘holistic’ assessments. It looks at the role of employers and trade unions in planning and undertaking assessments, and how this can enhance the quality of assessment and improve the credibility of certification.

The role of assessment and certification in vocational education and training
The central aim of vocational education and training (VET) programmes is to provide graduates with the competences necessary to do specific jobs and, alongside those competences, the certification which assures employers and other stakeholders that graduates have those competences. Summative assessment, typically a precondition of certification, is therefore a key part of VET programmes, ensuring not only that job competences are part of a qualification and the taught curriculum, but also that graduates have actually acquired the required competences. Effective assessment is thus a necessary element of a strong VET system.
In Brazil, as in other countries, the confidence of employers and trade unions – the social partners – in VET programmes is likely to depend, among other factors, on the quality of assessment systems. Direct employer involvement in the design and implementation of assessments will in itself do much to enhance that confidence, a point discussed throughout the chapter.
However, the objectives of assessment go wider than just demonstrating occupational competence to employers. Other objectives of such assessments include:
- As formative assessment, to support and inform the learning process, through feedback to both learner and teacher.
- To motivate students, recognising and rewarding their learning, and to signal the mix of competences they need to acquire.
- To demonstrate the wider and deeper competences that go beyond immediate job skills, and underpin the further learning that will be necessary as jobs change and individual careers develop.
- As aggregated statistics, to provide information about the performance of training providers, and the impact of changing policy and practice.
Assessment and certification in the wider VET context: Occupational standards and qualifications
Assessment and certification must build on an understanding of particular occupations and the competences they require. Countries often maintain a formal arrangement for identifying and classifying occupational standards. These standards describe occupations and the competences they require, and are sometimes also linked to associated qualifications and certifications. Passing an assessment which tests possession of these competences then often becomes a precondition of certification. In Brazil, the Ministry of Labour maintains the “Classificação Brasileira de Ocupações” (CBO), the Brazilian Classification of Occupations. The CBO lists existing occupations in the Brazilian formal labour market, briefly describes the job content, and sets out the education and experience requirements in terms of qualification level and field of study (OECD, 2018[1]). An example from another country of occupational standards, and of how governments may work with employers and unions to manage the system, is given in Box 5.1.
Box 5.1. Occupational standards, qualifications and the assessment system in Estonia: Systematic involvement of the productive sector
In Estonia, the education ministry delegates responsibility for the professional qualifications system to a qualifications authority (Kutsekoda), steered by representatives of employers, unions and government working together. This body organises and co‑ordinates the activities of professional councils and keeps the register of professional qualifications.
Professional councils, representing 14 sectors, approve and update professional standards, and include representatives of trade unions, employer organisations, professional associations and public authorities. The professional standards set out the content of different occupations and the competences expected of individuals in those occupations.
Professional councils select awarding bodies (public and private) to organise the assessment of competences and issue qualifications. The awarding bodies are selected for five years through a public competition organised by the qualifications authority. VET providers may also be given the right to award qualifications, if the curriculum of the institution complies with the professional standard and is nationally recognised.
To manage assessments, the awarding body sets up a committee involving sectoral stakeholders: employers, employees, training providers, and representatives of professional associations. Often it also includes customer representatives and other interested parties. This ensures impartiality in awarding qualifications. The committee approves assessment procedures, including examination materials, decides on awarding qualifications, and resolves complaints. It may set up an assessment committee that evaluates the organisation and results of the assessment and reports to the qualifications committee. The assessment committee verifies to what extent the applicant’s competences meet the requirements of the professional qualification standards. A person’s competences can be assessed and recognised regardless of whether they have been acquired through formal, non-formal or informal learning.
Source: Adapted from CEDEFOP (2014[2]), VET in Europe: Country Report 2014, https://www.cedefop.europa.eu/en/publications-and-resources/country-reports/estonia-vet-europe-country-report-2014. See also Kutsekoda Estonian Qualifications Authority (2021[3]), Estonian Qualifications Authority, https://www.kutsekoda.ee/en/.
Certification: The context for assessment
Certification usually depends on completion of approved training, as well as an assessment. In the context of vocational qualifications, certification is typically intended to confirm that an individual is occupationally competent. It also provides additional information. For example:
- In England (United Kingdom), the T Level vocational certificate will include separate grades for the core component, using A* to E, and for each occupational specialism, shown as pass, merit or distinction (DfE, 2020[4]).
- Sometimes, more than one certificate is issued to reflect the different components of a programme and the connected assessments. For example, in Germany, the graduating apprentice receives three certificates: one reflects an examination of professional competence, one reports performance in the vocational school, and one reports the training employer’s appraisal of the apprentice’s performance.
Certification commonly follows success in an assessment, but often depends on more than just an assessment, such as completion of a learning programme, or requirements for employment experience or placements with an employer. For example, upper secondary VET programmes in Brazil will be subject to various quality requirements, regarding curricula, number of learning hours, vocational teachers and so forth, that are designed to ensure that the training is of good quality. The assessment, alongside these requirements on the training programme, should provide assurance that those passing through the programme and successful in the assessment are competent. Employer involvement in assessment therefore sits alongside employer involvement in the design and delivery of the training programme in underpinning the credibility of certification.
Sometimes certification may depend almost entirely on assessment. In theory, if an assessment successfully tests everything required for occupational competence, it should not matter what type of formal or informal training has been undertaken. That theory underpins radical recent reforms in Finland that remove much of the regulation of training programmes (Box 5.2). But assessments are fallible: even with the strongest assessments, those who are not competent may pass, and those who are competent may fail. So while we should aim for the best possible assessment, that aim needs to be tempered by realism.
Box 5.2. In Finland, deregulating the pathways to qualification
In 2018, Finland made large changes in its system of vocational education. At upper secondary level, a fixed three-year programme was replaced by a more ‘personalised’ model for both adults and young people. Under this model, students may start the programme at any point during the year, and follow an individual study path of no fixed length, adapted to their individual needs and allowing for recognition of prior learning. Increasing emphasis is placed on virtual learning environments, and work-based learning. These different study paths all lead to the same final qualifications. Assessment, which previously involved some emphasis on examinations, will now give primacy to the demonstration of vocational skills in the workplace (Finnish Ministry of Education and Culture, 2017[5]; Karttunen, 2018[6]).
At first sight, the attractions of this model, in responding to individual student needs, encouraging lifelong learning, and meeting employer needs, are substantial. However, expecting local providers to manage an almost infinite variety of learning pathways while maintaining consistent national standards asks a great deal of them.
Source: Adapted from Field, S. (2021[7]), A World Without Maps: Assessment in Technical Education, https://www.gatsby.org.uk/uploads/education/reports/pdf/assessment-in-technical-education-simon-field.pdf.
One special type of assessment, linked closely to certification, is recognition of prior learning (RPL), which is designed to help adults with work experience or unrecognised training obtain partial or full recognition and certification of what they already know how to do. Previous OECD work identified some gaps in Brazilian provision in this field relative to other countries (Box 5.3).
Box 5.3. Recognition of prior learning: How Brazil compares
In Brazil, while there is a decentralised programme for the formal recognition of prior learning, called “Rede CERTIFIC”, this programme was never fully developed and implemented. Very few schools became members of the “CERTIFIC” network and the number of certificates issued remains very small. As a consequence, individuals have no means of proving that the experience and knowledge they have accumulated over time is sufficient to enrol for training courses with entry requirements.
In response, the OECD recommended that a large-scale programme for the recognition of prior learning should be developed and implemented, accompanied by public awareness campaigns highlighting the benefits of participation in adult training. Such a system could improve the Brazilian adult learning system in two respects: (i) it would help engage in adult learning those older workers who might otherwise be excluded from programmes because they lack entry requirements; and (ii) it would establish a standardised framework for selecting candidates who do not possess formal qualifications, minimising the discretion exercised by staff working for different adult learning providers (OECD, 2018[1]).
Many countries have a system of RPL in place (Field and Guez, 2018[8]):
- In France, a 2002 law establishes an individual right to the recognition of professional experience (validation des acquis de l’expérience, VAE) in the acquisition of an academic qualification. This allows an individual to obtain part or all of the qualification based on professional experience. The candidate prepares an application documenting their relevant professional experience, which is then examined by a panel including both academic and professional members. The panel may then either grant the full qualification, or alternatively set out the courses that need to be followed by the candidate to complete the qualification. The qualification obtained is the same as can be obtained through academic study.
- Mauritius implemented an RPL system in 2009. It follows the qualifications set up in the national qualifications framework (NQF) and has been widely accepted by all stakeholders, so that RPL has been progressively extended to a larger range of sectors. First, the application for RPL is pre-screened. Then, if the pre-screening is successful, an RPL facilitator is assigned and helps the candidate build a portfolio, collecting the candidate’s employment history, evidence of skills and any relevant experience, within a period of three months. The portfolio can include formal education results, samples of work produced, performance appraisal reports and references, photographs of work activities and written testimonials. After submission of the portfolio, an assessment against a selected qualification is carried out through an interview. If the applicant meets the standards, the assessment leads to the award of a full or partial qualification. In Mauritius, RPL thus serves as a bridge and feeder to further and higher education.
- The University of the Western Cape, in South Africa, provides RPL services to thousands of students. There are two ways for candidates to apply for admission to the university through RPL. First, the portfolio development course is a sixteen-week programme in which students attend lectures, write assignments and document their learning history to produce a portfolio that will be assessed in the application process. The course also prepares students for interviews and includes mentoring support. Second, the Tests for Access and Placement include two different tests: the first assesses applicants’ motivation and level of prior learning, while the second (the National Benchmark Test) evaluates their potential to cope with the level of academic skills, writing and numeracy needed in higher education.
Building valid and reliable assessments
Any vocational assessment may be conceived as involving, first, a set of tasks which candidates are expected to perform and, second, procedures for evaluating candidates on the basis of those tasks. Academic assessment tasks, for example, often take the form of examination questions requiring written answers, while the procedures for evaluation might involve a marking framework and guidelines, as well as organisational features such as trained examiners and moderation and appeal procedures. In England, for example, the assessment plan for a pipe welder includes tasks such as welding operations and health and safety procedures, while the procedures include rules such as the requirement that, for an overall pass, the candidate must pass every assessment module. The outcome depends both on the tasks and the procedures (IfATE, 2021[9]).
Assessments are often appraised in terms of reliability (consistent standards) and validity (assessing the right competences) (New Zealand Ministry of Education, 2021[10]; Darr, 2005[11]).
Validity refers to the capacity of an assessment to accurately measure what it intends to measure. For a vocational assessment this will mean the capacity to accurately measure the ability of the candidate to perform well in the target occupation.
Reliability refers to consistency, so that the assessment applies the same standards to different individuals, in different contexts, with different assessors and assessment bodies, and does not change over time. Various quantitative measures of reliability are available.
These two characteristics are different, but not independent. A highly valid assessment is necessarily also reliable, since a very inconsistent measure cannot yield accurate predictions of occupational competence. However, a reliable vocational assessment may have low validity (see Box 5.4 on France).
Box 5.4. Reliability, validity and medical training in France
In France, the six-year vocational programme for doctors used to terminate in a single national written examination, the épreuves classantes nationales (ECN). Marks received in this examination have a big influence on the career of a doctor, as places in more or less prestigious hospitals and specialities are determined by these marks. While this national examination removes the risk of variation in marking between test centres – a critical point given the influence the examination has on a medical career – it is being reformed because it is deemed to place insufficient emphasis on medical skills as opposed to knowledge. The new assessment tools will include simulations to measure practical medical and interpersonal skills. So an assessment of high reliability but questionable validity (the national examination) is being reformed to improve validity, though possibly at some cost to reliability.
Source: Adapted from Field, S (2021[7]), A World Without Maps: Assessment in Technical Education, https://www.gatsby.org.uk/uploads/education/reports/pdf/assessment-in-technical-education-simon-field.pdf.
Standardisation
In academic education, strenuous efforts are made to standardise assessments by making both assessment tasks and procedures as stable and consistent as possible. This often involves national written examinations, with strictly controlled nationally organised procedures for appraising performance in these examinations so that all candidates face the same or very similar assessment tasks and are marked in the same way. In the context of VET, there are similar grounds for pursuing as much standardisation as possible, but, as will be explained below, there are some countervailing considerations. To begin with, it is worth emphasising what can readily be standardised in a vocational assessment.
The procedures for assessment in the sense of criteria for assessment, persons involved in the assessment, rules for resits and retakes and so forth should be, and usually are, designed to be as consistent as possible. Arrangements such as validation and external assessment, and other quality assurance measures, are often designed to reinforce procedural consistency and therefore reliability. In the Spanish Basque country for example, a calibration procedure is used to ensure that teachers use similar assessment and grading criteria. Every two years, groups of vocational teachers are brought together to correct the same written assignment independently and then compare outcomes, discussing their proposed grades and seeking consensus. The results of this grading discussion are recorded for future use (CEDEFOP, 2015[12]).
Some knowledge-based VET assessment tasks can readily be standardised. The knowledge element of occupational competence can often be appraised through written tests. Thus an electrician needs to understand the physics of electricity. This theoretical or knowledge dimension is often classroom taught, and typically assessed through written examinations, which may be standardised. In New Zealand, for example, trainee electricians must undergo a practical assessment and a final computer-based examination. The practical assessment is undertaken in a decentralised way by different training providers, with results submitted to the Electrical Workers Registration Board for approval. The final examination for an electrician involves a multiple choice, computer-based test undertaken in an examination centre. Candidates must have undertaken an approved course with an approved training provider to be eligible to take the examination. Resits are allowed but after three failures within three months, some retraining will be required (Electrical Workers Registration Board, 2021[13]).
Some practical skills may also be assessed in a standardised way, by defining a set of tasks expected of all candidates and requiring candidates to undertake those tasks under controlled conditions, such as in a regional vocational assessment centre. In England this takes place through the AM2 test (National Electrotechnical Training, 2021[14]). In Switzerland, assessment of the practical skills of an apprentice involves, first, an examination related to a standardised set of tasks or a project, the same for all candidates in the occupational field and usually conducted at the same time; and second, an individual practical project completed at the workplace and agreed with the individual employer. The project is presented by the apprentice to the examiners, who award the grade (International Labour Organisation, 2020[15]).
In Brazil too, standardisation is common in adult VET (see Box 5.5).
Box 5.5. Standardised assessment in SENAI programmes in Brazil
In SENAI courses, once 80% of the training course has been completed, students can be asked to sit an online test to evaluate whether they have acquired the necessary skills for the occupation they are training for. These tests are prepared by the faculty of SENAI and consist of multiple-choice questions. They assess the specific skills that students should have developed during the training, but also general and management competences. The online tests are standardised and common across all SENAI schools in the country. Students are also required to fill in a short background questionnaire providing information on their socio-economic context. However, such online tests have limitations in assessing real work challenges. Since 2017, a subset of the students who take the online test are therefore also selected for a practical test, which presents students with a concrete problem that could come up in their work routine and assesses the proposed solution (OECD, 2018[1]).
Source: OECD (2018[1]), Getting Skills Right: Brazil, https://dx.doi.org/10.1787/9789264309838-en.
In fields where working practice involves human subjects (as in healthcare) or expensive machinery (such as aircraft or CNC machines), technology-assisted simulation, where no persons or expensive machines are at risk, has large attractions, including for the standardisation of assessment. Simulation also allows both practice and assessment in the handling of rare but critical events, such as medical emergencies or engine failures for pilots. A controlled set of challenges can be offered, both to train students and subsequently to assess their skills. A substantial literature has emerged on the use of such technology, recognising both its potential and its limitations (for example in simulating the capacity to address interpersonal challenges). Simulation technology may also facilitate standardisation in assessment, so that candidates face the same or similar challenges in a final examination (Ahn and Nyström, 2020[16]). In the reform of medical education in France, designed to enhance the assessment of practical skills, the intention is to use simulation technology, notably programmable robotic patients, not only to train but also to assess medical students (see Ministère de Solidarité et de Santé (2018[17]) and Box 5.4).
Work-embedded assessment tasks: A challenge to standardisation?
In the productive sector, ability to do the job is assessed most directly by looking at how well candidates perform authentic work tasks: for a plumber, not just undertaking a standard task like fixing a leaking pipe, but also negotiating with a client, diagnosing the plumbing fault, working and communicating with other artisans, costing and scheduling the repair task, and dealing with unexpected vocational, practical and human challenges in the course of the work. Some soft competences and dispositions, like teamwork, resilience and conscientiousness, are critical to success in many workplaces (an issue further discussed below). In response, many VET systems use assessment tasks embedded in the everyday reality of the workplace to address occupational competence more fully (as Brazil has done recently in the SENAI assessments described above). In the Netherlands, this approach has been more fully embraced, as set out in Box 5.6.
The difficulty is that such authentic work tasks are, as many commentators have recognised, extremely difficult to standardise (Stanley, 2017[18]; Yu and Frempong, 2012[19]). As a result, there is always some tension between the objective of a fully standardised assessment delivering full reliability, and an assessment employing tasks which are fully realistic and reflective of authentic working practice.
Box 5.6. Work-embedded assessment projects in the Netherlands
In the Netherlands, the practical component of vocational assessment at upper secondary level is linked to work placements. A project, associated with a real professional activity, is chosen and approved (by the student, the trainer at school and the workplace trainer) at the beginning of the school year. The candidate must then carry out the project within the company (while treated as a regular employee) over a period of around six weeks. The student prepares a written report and a presentation, in which they are expected to demonstrate mastery of the required learning outcomes. Assessment and grading are undertaken by a minimum of two people to ensure impartiality – typically the school and workplace trainers. In their final year, students take part in three or four of these projects. This practical assessment is a component of a decentralised assessment system in which the training providers themselves (or regional authorities) develop examination questions and tasks. However, since 2014 the certification exams in English, Dutch and mathematics which form part of the vocational programmes have been centralised.
Source: CEDEFOP (2015[12]), “Ensuring the Quality of Certification in Vocational Education and Training”, Cedefop Research Paper, No. 51, https://data.europa.eu/doi/10.2801/79334.
Competing advantages of standardised and work-embedded assessment tasks
Drawing on all the points made above, Table 5.1 summarises the pros and cons of using standardised tasks in VET assessments as opposed to assessment linked to more authentic working practice. Standardised tasks offer more confidence that the assessment applies the same standards to all candidates, and are often suitable for the cognitive, knowledge part of occupational competence, as they can be assessed using written examinations. Comparative performance of training providers on standardised tasks also provides helpful data aiding the quality assurance of providers. Conversely work-embedded assessment tasks are much more closely aligned with the realities of working practice, and better suited to the appraisal of certain higher level occupational competences and dispositions. Finally, and critically, employers obviously have a more natural role in both devising and undertaking such work-embedded assessments.
Table 5.1. The relative advantages of standardised and work-embedded tasks in assessment in vocational education and training
| | Standardised tasks | Work-embedded tasks |
|---|---|---|
| Confidence that the same standards are applied to all candidates | Yes | More challenging |
| Suitability to cognitive aspects of competence | Yes | More challenging |
| Provide good data on the performance of training providers | Yes | More challenging |
| Realistic work tasks assessed | More challenging | Yes |
| Suitability to the assessment of soft cross-curricular competences like teamwork | More challenging | Yes |
| Engagement of employers in assessment | More challenging | Yes |
There is a strong case for work-embedded assessment as the most credible test available for occupational competence. But it would be self-defeating if the effect were to permit so much variation that pure luck comes to determine the difficulty of the assessment task, and hence who passes the assessment. In response to these competing considerations, a balance may be struck through steps such as the following:
- As discussed above, the knowledge-based part of occupational competence can be assessed in a standardised way through written tests.
- Also as discussed above, the procedures used to assess work-embedded tasks may be subject to standardisation.
- Assessment tasks, even if variable and work-embedded, may still be required to meet standardised requirements, for example to ensure that they always allow for the assessment of the key elements of occupational competence.
- Standardised and work-embedded assessment tasks may be blended in a composite assessment. Sometimes the knowledge part of occupational competence can naturally be tested through a standardised national examination. In the Czech Republic, for example, students in vocational upper secondary programmes are assessed through a combination of national exams and practical tests devised and managed by local vocational schools (CEDEFOP, 2015[12]).
Federal and decentralised assessment standards
In many countries with decentralised and federal governance arrangements, like Brazil, standardisation of assessment may take place at sub-national level. Clearly, if qualifications and certifications are defined sub-nationally, then the assessments need to follow suit. However, if assessment procedures and tasks are agreed more locally than the associated qualification, the question arises of whether the qualification obtained through one locally determined assessment reflects the same standards as the same qualification obtained in a different area through a different assessment. Some of these challenges are apparent in some Italian qualifications (Box 5.7).
Conversely, in some federal systems, national assessments can be designed so as to address some regional differences in qualifications and programmes, and deliver a nationally recognised certification as in Canada. Although apprenticeship is managed and delivered by the different provinces and territories, a national assessment (the Red Seal examination) is developed in collaboration to meet requirements of all 13 provinces and territories. In this way, apprentices qualifying in one province or territory may obtain a nationally recognised endorsement to their certification in their chosen trade (see also Box 5.7 and Chapter 4).
Box 5.7. In Italy and Canada, assessments reflect and address regional variations
In Italy, alongside regular school-based upper secondary vocational education, there are regionally organised vocational programmes (IFP). The assessments for these programmes vary from region to region, but some elements are common. There are three types of tests, theory, practical, and oral, developed by institutions in some regions, and at regional level in others. Sometimes theory tests are prepared regionally and practical tests at institution level. The final mark is a weighted sum of marks from the last year of study, the final examination and sometimes an employer appraisal from a placement. The weighting varies between regions (Eurydice, 2020[20]).
In Canada, apprenticeship is managed by the separate provinces and territories, but a national examination is used to assess candidates in each of the Red Seal trades. Successful candidates receive a Red Seal endorsement to their certification which is recognised across Canada. The Red Seal examination, which is based on national occupational standards, contains between 100 and 150 multiple-choice questions, to be answered during a four-hour examination. Around three‑quarters of candidates pass the exam (Canadian Council of Directors of Apprenticeship, 2016[21]).
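The Italian weighted-sum calculation described in the box can be sketched as follows. The weights and marks are purely hypothetical, since weightings vary between regions; the sketch simply shows how an optional employer appraisal can be folded into a weighted final mark.

```python
def final_mark(year_mark, exam_mark, employer_mark=None,
               weights=(0.4, 0.5, 0.1)):
    """Weighted final mark from last-year marks, the final examination and
    an optional employer appraisal from a placement.

    The weights are hypothetical (they vary between regions). If no employer
    appraisal is available, its weight is redistributed proportionally over
    the remaining two components.
    """
    w_year, w_exam, w_emp = weights
    if employer_mark is None:
        return (w_year * year_mark + w_exam * exam_mark) / (w_year + w_exam)
    return w_year * year_mark + w_exam * exam_mark + w_emp * employer_mark

# With an employer appraisal: 0.4*70 + 0.5*80 + 0.1*90
print(round(final_mark(70, 80, 90), 2))  # 77.0
```

The renormalisation step matters: simply dropping the employer component without rescaling would cap the achievable final mark below the maximum for candidates whose placement was not appraised.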
Assessing the right set of skills at the right time
Holistic vs atomistic assessment
A second distinction, related to that between standardised and work-embedded assessment tasks, is that between holistic and atomistic assessment. In working life, employers most naturally assess potential recruits ‘in the round’, or holistically, to decide whether they are capable of doing the job. Conversely, atomistic approaches break down occupational competence and assessments into their elements and check them off one by one. The linkage with the standardised/work-embedded distinction flows from the fact that holistic assessment usually requires an authentic work-embedded assessment task, whereas atomistic approaches to assessment can be either standardised or work-embedded.
The departure point for atomistic assessment is an atomistic approach to occupational competence and standards. This involves breaking down an occupation into the set of competences required to undertake that occupation. The competences include, for example in the case of a blacksmith, different elements of knowledge, such as knowledge of materials used, different skills such as design drawing, and dispositions and behaviours, such as a commitment to safety in the workplace. Full occupational competence is then defined in terms of the possession of this list of competences (Annex 5.A).
Given such a list of competences, an assessment may be designed to measure acquisition of each competence on this list. This establishes a transparent link between assessment and the list of competences which constitute occupational competence, a big advantage for quality assurance. It also allows assessments, for example, to identify a set of core competences which are always necessary, while having a more optional approach to other competences. Elements of knowledge can also be separately assessed from other competences – for example in a written examination that is separate from the assessment of practical skills. For a blacksmith in England, the associated assessment plan is described in Annex 5.A.
Under a holistic approach, occupational competence is sometimes associated with a professional identity that is over and above the tasks required in the job and the associated competences – competences which may come and go with changing technology and workplace organisation. This approach may depend partly on the profession: professions like teaching and nursing are associated with strong professional missions and values, independent of the changing set of competences necessary to deliver those missions. In other professions, like IT technicians, professional missions are harder to define. Given a professional identity, the argument is that occupational competence needs to be assessed in relation to that identity, rather than in relation to more contingent competences.
A separate argument for holistic assessment is that real work requires not just a list of separate competences, but also the capacity to apply a judiciously chosen set of those competences in response to complex and sometimes unexpected practical challenges. Such a high-level capacity is both critical to success in the workplace and more naturally addressed by holistic assessment. The terminology can be confusing, as such capacities are variously characterised, in overlapping notions, as ‘soft’, ‘21st century’, ‘meta-cognitive’ and ‘cross-curricular’ skills and competences. Thus, for example:
In Scotland (United Kingdom), ‘meta-skills’, including self-management, social intelligence and innovation, have been argued to be the key to future-proofing the skills system. It is maintained that such competences are not easily taught in a classroom, and that they can most naturally be developed in a work-based context. By the same token, they are difficult to measure, and therefore assess, except in the context of regular work, or special projects that closely mimic the demands of ordinary work (Skills Development Scotland and Centre for Workbased Learning, 2018[22]).
Bjaelde, Lauridsen and Lindberg (2018[23]) describe how assessments of higher-level professional competences (often at tertiary level) require candidates to solve ‘authentic’ work problems with competences that may include, for example, teamwork, critical thinking and interpersonal skills.
Table 5.2 summarises the pros and cons of atomistic and holistic approaches. More atomistic approaches make it easier to check that all required skills for an occupation have been tested, and allow periodic modular assessments associated with each training module to check on the acquisition of the competence(s) developed in each module. Standardisation is also facilitated, for example through agreement on the relative importance of different competences in assessment. Conversely, holistic approaches more naturally fit with the assessment of real work tasks; they can also more readily capture the soft skills necessary to solve multi-faceted problems. Employers will also recognise such assessment tasks more naturally since they are work-based.
Table 5.2. The relative advantages of atomistic and holistic approaches to assessment in vocational programmes
|  | Atomistic assessment | Holistic assessment |
|---|---|---|
| Confidence that all relevant competences have been identified and assessed | Yes | More challenging |
| Allow for modular assessment and partial credit | Yes | More challenging |
| Support standardised assessment | Yes | More challenging |
| Realistic work tasks assessed – consistent with a holistic view of professional identity | More challenging | Yes |
| Suitability to the assessment of soft skills, in which candidates deploy a mix of competences to solve a problem | More challenging | Yes |
| Engagement of employers in assessment | More challenging | Yes |
The atomistic and holistic approaches are not mutually exclusive. Occupational competence involves both atomistic elements of knowledge and skills, and the capacity to make use of these competences to solve workplace challenges. With this point in mind, it is natural and desirable for assessments to reflect both approaches. One example of this type of blend is the model of Luxembourg (Box 5.8).
Box 5.8. A blend of periodic modular and holistic final assessment in Luxembourg
In Luxembourg, the vocational education system has many similarities with that of Germany, with a dual system of apprenticeship at upper secondary level, alongside some school-based vocational programmes. Summative assessment involves both periodic and a final assessment (European Alliance for Apprenticeships, 2021[24]):
Modular periodic assessment. The programmes are organised in modules, each leading to a subset of competences for a specific occupation. Each module is assessed individually by the vocational education teacher or the in-company trainer responsible for the associated teaching or training. The apprentice must pass a fixed proportion of mandatory modules before entering the final assessment.
Final holistic assessment. A 2008 reform replaced theoretical and practical final exams with an assessment based on an integrated project, which corresponds to a simulated or real working situation, undertaken over a period of up to 24 hours. The integrated projects are developed and assessed by teams of experts from employer organisations, and vocational teachers from secondary schools (plus some additional assessors). Success in this final assessment leads to certification.
Assessing wider and higher level aspects of occupational competence
As already argued, many countries have been keen to include in their understanding of occupational competence, and reflect in vocational assessments, a range of wider and sometimes higher level competences that go beyond the performance of routine job tasks. These include the soft skills mentioned above, which are needed to fully exploit other competences. They also include basic skills that underpin the capacity to learn: literacy, numeracy and an increasingly important set of digital skills. Such skills underpin further learning, both directly on the job in response to new technological and other developments in the workplace, and to support further formal education and training that may be a necessary element of career development.
In recent decades, partly because of rising educational aspirations, and partly because of increasing skills requirements in the labour market, many vocational programmes and qualifications have been modified to facilitate access to further learning, including higher education (Field and Guez, 2018[8]). Consequently, vocational assessments have needed to give attention not just to the immediate ability to perform on the job, but also to the foundational competences that enable individuals to go on learning. These include numeracy and literacy, but also wider elements of general education. In many countries, this is realised through an upper secondary vocational track, which offers occupation-specific vocational training alongside a substantial component of general education (as also discussed in Chapter 2).
Vocational education and training programmes, as well as including general education, may also directly or indirectly develop other generic competences, including what are commonly called employability skills. These include traits such as self-discipline, honesty and determination, and interpersonal skills such as teamwork. These skills have significant labour market returns and are often best developed through work‑based learning rather than in classrooms (Lerman, 2013[25]). Some of these skills, notably interpersonal skills, have been shown to be of increasing relative importance in England, most plausibly because they correspond to the elements of occupations that are least subject to automation (Adecco, 2017[26]). In the United States, much attention has been given to these competences, and different tools developed to assess them independently of occupation-specific competences (Box 5.9).
Box 5.9. The United States is distinctive in making use of free-standing assessments of employability skills
High schools in the United States do not have the kind of systematic vocational tracks found in many European countries, but in a decentralised system which varies extensively from state to state and school to school, vocational courses are sometimes offered within a broadly comprehensive approach to upper secondary education. Much emphasis is placed on demonstrating that those graduating from high school are ‘career and college-ready’.
The ‘career-readiness’ element of this is subject to diverse assessment tools. In a survey across states, assessments of career readiness were classified into tests of academic, employability and vocational skills. Vocational skills were often assessed through industry-recognised certifications of different types. But the United States is distinctive in making extensive use of tests of employability independently of specific vocational domains; these include ‘Work Keys’ (used in 32 states) and the ASVAB (developed by the Department of Defense), also used in 32 states. Very often students themselves, alongside school districts, have to share the costs of these assessments (Centre on Educational Policy, 2013[27]).
Final, periodic and formative assessment
The most common assessment arrangement is a final examination of some type. For example, one survey found that nine of eleven European countries surveyed tended to use final assessments in their initial vocational education systems, the exceptions being Spain and Finland (CEDEFOP, 2015[12]). Such final assessments may be contrasted with ‘periodic’ assessments undertaken at intervals throughout a learning programme and contributing to the final grade. The arguments for final assessments are simple: they are administratively neat, and better placed than periodic assessment to assess occupational competence as an integrated whole through a holistic assessment, as discussed above. However, several factors suggest that there is also value in periodic assessments conducted in the course of a learning programme:
Formative assessment uses information about learning progress to guide both teachers and learners and thereby support learning. It is therefore necessarily periodic. Much evidence shows that it is a powerful pedagogical tool in general education (Black and Wiliam, 1998[28]), suggesting, although direct evidence is limited, that the same might be true in vocational education (University of Exeter, 2021[29]). In a vocational programme this would, for example, involve the use of periodic tests, with the results fed back to both teacher and student to identify what has been learnt, and what remains to be acquired, thus guiding both teacher and learner in subsequent education and training. Much of this activity may be informal, taking place between the teacher and the student as a form of personalised pedagogy. However, it can also be formalised: Norway, for example, requires half-yearly formative assessments in the consecutive school and workplace segments of apprenticeship programmes. In the final two-year workplace segment, the half-year assessments are undertaken by the apprentice’s training supervisor in the training company. The supervisor is expected to explain to the apprentice what competences they still need to acquire, and how they can be acquired (Norwegian Ministry of Education and Research, 2006[30]).
Periodic assessment can also fit well with a modular approach in which individual elements of occupational competence are separately assessed, facilitating a record of partial learning. This can be used, as in Denmark, as a means of granting the possibility of later completion to those who might otherwise drop out with nothing to show for their efforts (Danish Ministry of Education, 2008[31]).
Earlier chapters of this report have described the critical value of work-based learning as part of vocational programmes. Assessing the learning outcomes from such placements therefore offers a strong indicator of occupational competence, and signals to the student the importance of what can be learnt in the workplace. But as these placements are removed from the main training provider for a vocational programme, assessing the learning outcomes can be challenging. Some countries make assessments of placements a more formal part of overall assessment. For example, in France, the 22-week placements that are part of the baccalauréat professionnel are assessed by teachers from the students’ vocational schools, and represent a substantial contribution, varying by profession, to the overall mark in the baccalauréat (Field, 2021[7]).
The case for an element of periodic assessment is stronger in longer programmes, where students need formative feedback in the course of the programme, the risk of dropout is higher, and programmes may involve separate work placements requiring assessment. Thus, for example:
In German apprenticeships, an assessment normally takes place halfway through the 3‑4‑year programme to measure the apprentice’s acquisition of both theory and practical skills. This is used formatively to provide feedback on learning progress, but, increasingly, it is also used summatively, representing 30-40% of the final mark for the apprenticeship, depending on the profession (see the case of plumber assessment in Germany as set out in Annex 5.B).
In Swiss apprenticeships, some elements of periodic assessment, reflecting marks given by teachers in inter-company training courses, and in classroom-taught courses, contribute to the final mark in the overall assessment (Field, 2021[7]).
In Luxembourg, vocational assessment involves a unique mix of modular periodic assessment and a holistic, work-embedded final assessment (Box 5.8).
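The contribution of periodic marks to a final grade, as in the German and Swiss examples above, amounts to simple weighted averaging. The following is a minimal sketch; the function name, weights and pass threshold are hypothetical, chosen for illustration only, and are not drawn from any country's official scheme:

```python
# Illustrative sketch: a final mark computed as a weighted blend of periodic
# module marks and a final holistic assessment mark, all on a 0-100 scale.
# The 35% periodic weight and the pass threshold of 50 are assumptions.

def final_mark(module_marks, final_assessment_mark,
               periodic_weight=0.35, pass_threshold=50.0):
    """Blend the mean of periodic module marks with a final assessment mark."""
    if not 0.0 <= periodic_weight <= 1.0:
        raise ValueError("periodic_weight must lie between 0 and 1")
    periodic_mean = sum(module_marks) / len(module_marks)
    mark = (periodic_weight * periodic_mean
            + (1 - periodic_weight) * final_assessment_mark)
    return mark, mark >= pass_threshold

# Four module marks gathered over the programme, plus a final assessment mark.
mark, passed = final_mark([62, 70, 58, 75], 68)
```

In a scheme of this kind, the weighting parameter is where the policy choice lives: Germany's 30-40% periodic share, for example, would correspond to `periodic_weight` values between 0.3 and 0.4, varying by profession.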
This points to the value of building regular assessment into VET programmes in Brazil.
Involving all relevant actors in the assessment process
Independence in assessment
The experience of individual teachers and trainers is valuable. In all types of learning, the teachers and trainers responsible for delivering vocational education and training will have most direct knowledge of the performance of trainees, and therefore naturally often play a big role in assessment. In Spain for example, teachers responsible for the different modules of technical programmes participate in assessment board discussions of grading individual students, and deciding whether students can continue to the second year of the programme (CEDEFOP, 2015[12]).
There are two difficulties with relying on local teaching and training staff to conduct assessments. First, these staff may be biased: positively, perhaps, because the performance of the trainee can reflect on their own performance as trainers, or negatively, because of conflicts with the trainee. In the context of academic assessments there is evidence of different types of bias (Lee and Newton, 2021[32]). Second, these staff are not always able to apply the same standards as those applied in other contexts. For these two reasons, assessments undertaken by local teaching staff may not be reliable. This means that while it can be valuable to take trainer views into account in assessments, assessments also need a degree of independence from the training process. Such independence can enhance reliability, for example when the same external assessor helps to ensure consistent grading of students in different local contexts.
Sometimes independence is realised through nationally or regionally organised examinations, including practical assessments, which are independent of local teachers and trainers. But, as argued earlier, it is often desirable to assess occupational competence in real-world contexts, where local actors are necessarily involved. Some degree of assessment independence can then be realised by mixing an external independent assessment, with an internal element. Some examples follow:
The Australian state of Victoria is currently running an independent assessment trial in apprenticeship, whereby an independent assessing authority designs, develops and conducts the assessments for eight selected occupations. Training providers are key partners in piloting the independent assessments (Victoria State Government, 2022[33]).
In Austria, the final assessment for apprentices includes both a practical and a theoretical component. The practical part of the examination is organised by the regional apprenticeship offices and managed by a board of examiners including a chairperson appointed by the regional advisory board on apprenticeship, one representative of employers and one of employees. The chairperson must be an authorised apprenticeship trainer and at least one other member of the board must be a professional expert (European Alliance for Apprenticeships, 2018[34]).
In Hungary, responsibility for assessment is shared between the technical school which organises the assessment and an independent examination committee (CEDEFOP, 2015[12]).
In Korea, technical qualifications are typically awarded after an internal assessment undertaken by a training institute, and an external assessment undertaken by the awarding body (Human Resource Development Korea), with the award depending on an adequate score in both assessments (Coles and Bateman, 2017[35]).
The role of social partners in assessment
The productive sector may be involved both in developing and undertaking assessments. For two good reasons, employers and worker representatives have a major role to play in assessment. First, those with most direct and up-to-date knowledge and experience of workplaces and working requirements are often best able to see what competences are required of workers, and how to test those competences. Second, the involvement of the productive sector in assessment grants it greater credibility, so that associated certifications will be granted more weight by other employers. Given this strong rationale, how can employers get involved in assessment? There are two main ways:
Through employer involvement in the development of assessments and assessment material (as well as in the underlying qualifications and occupational standards as discussed above), and in managing the whole process of assessment. This type of involvement is almost universal, although the depth of employer involvement is highly variable. One example of how this may work is apprenticeship in England (Box 5.10), where employers are fully involved in the establishment of occupational standards, and in providing a framework for assessment in an ‘assessment plan’, but employers are not involved in individual assessments.
Through direct employer involvement, as assessors, in individual assessments. This is much more variable, and in some contexts does not take place at all, with vocational teachers and examiners filling the role. However, in some country contexts, the productive sector, through employer and worker representatives, is directly involved in undertaking individual assessments. One notable example is apprenticeship in Germany, where representatives of both employer organisations and trade unions are required to take part in each individual assessment (Annex 5.B).
Employer involvement can make assessments more demanding, and therefore the certification more credible. The involvement of employers in individual assessments is linked to the issue of independence. A sharp distinction must be drawn between the involvement of an individual employer in respect of their own trainees or apprentices, where that employer may have local biases, and the involvement of employer representatives. Employer representatives are independent in the sense that they have nothing personally to gain or lose from an individual passing an assessment. Stakeholders such as employer representatives will however have a ‘point of view’, in the sense of a legitimate view about the threshold of performance required for a pass in the assessment. As representatives of employers collectively, they may want to set the threshold sufficiently high to be sure that those passing the assessment, and recruited by employers on that basis, are indeed competent. That may mean that they are less forgiving of marginal results in an assessment than other stakeholders, part of the price that is paid for ensuring that the assessment is fully credible. It is notable that the pass rate in the final assessment for apprentices in Germany (where employers are fully involved) is lower than in England (where employers are not routinely involved in undertaking assessments) (Field, 2021[7]).
Box 5.10. Employer involvement in apprenticeship programmes, certification and assessment in England
In apprenticeship in England, employers take the lead in establishing each apprenticeship qualification, through employer-led ‘Trailblazer’ groups, which identify the competences required for a job (Annex 5.A). The same groups also establish an ‘assessment plan’ setting out in some detail how apprentices are to be assessed at the end of each programme.
The individual employer of an apprentice is involved in assessment in that their approval is a necessary condition for the apprentice to proceed to the assessment. However the assessment itself is delivered by an independent assessment body, chosen from an approved list and selected by the employer of an apprentice.
The assessment plan, which runs to 24 pages in the case of a blacksmith for example, provides detailed guidance on how the assessment is to be conducted, marked and graded, including rules for resits and retakes. While assessment plans vary a lot between occupations, in the case of a blacksmith the plan sets out that assessment should take place through three exercises with equal weight:
Through a special project, involving the production of a project piece submitted alongside a design/development document.
Through a skills demonstration, observed by an independent assessor, involving the completion of all five fundamental blacksmith tasks: a) Forging, b) Thermal welding and cutting, c) Machining, d) Bench work, e) Tool making/maintenance.
Through a professional discussion underpinned by a workplace journal. This discussion will occupy a minimum of 40 minutes and will involve posing at least 8 competence questions.
Each of these elements is marked fail/pass/distinction using detailed criteria set out in guidance.
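The fail/pass/distinction marking of three equally weighted elements can be sketched as follows. The aggregation rule used here (any failed element fails the assessment overall, and distinction requires distinction in every element) is an assumption for illustration, not the published grading rule:

```python
# Hypothetical sketch of aggregating three equally weighted assessment
# elements, each marked fail/pass/distinction. The aggregation rule is an
# illustrative assumption, not the rule set out in the assessment plan.

GRADES = {"fail": 0, "pass": 1, "distinction": 2}

def overall_grade(project, skills_demo, discussion):
    marks = [GRADES[project], GRADES[skills_demo], GRADES[discussion]]
    if min(marks) == 0:
        return "fail"          # any failed element fails the assessment
    if min(marks) == 2:
        return "distinction"   # distinction requires it in every element
    return "pass"

overall_grade("pass", "distinction", "pass")  # → "pass"
```

Whatever the actual rule, making it explicit and mechanical in this way is part of what the detailed guidance in an assessment plan provides to independent assessors.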
Using assessment data to support VET policy implementation and quality assurance
Assessment, as well as supporting individual certifications, has an important role to play in the implementation of the wider VET reforms currently under way in Brazil. Whether these concern vocational teachers, institutions, or employer engagement and work-based learning, their central objective will normally be to deliver well-trained individuals on completion of vocational programmes. Individual assessments, taken collectively, are therefore a vital test of whether such reforms are working – by looking for example at pass rates, or marks and how they are changing. At the institutional level, assessment data can become an indicator of the quality of training provided by that institution. Assessment is therefore a very important part of the evaluation feedback which should attend implementation of VET reform, and changes of teaching practice and policy. Internationally, while academic test data are widely used to evaluate schools and education policies, this practice is less common in the field of VET, partly because of the diversity of vocational fields of study, and partly because assessment (as discussed above) is not always standardised. In Poland, national standardised examinations, with external examiners in regional centres, are used to quality assure the training provided in different regions and schools (Chłoń-Domińczak, 2019[36]).
Assessment data may also be used to support quality assurance. Assessment bodies themselves, as part of internal quality assurance and improvement, can check that their assessments are providing reliable tests of occupational competence, by seeking feedback from employers on any gaps in the competences of newly qualified persons, and addressing such gaps through modifications in assessment and through advice to training providers. For example:
The government of Western Australia provides guidance on a continuous improvement approach to assessment practice. This requires training providers to regularly review their assessment processes and outcomes to identify flaws and to pinpoint the scope for improvement. This involves a data collection strategy (for example client satisfaction surveys, and data from consultation with learners and industry bodies) to monitor assessments and support quality improvements. When assessment practice is modified to make improvements, the impact of the changes in assessment practice is monitored (Government of Western Australia, 2018[37]).
In England, the Institute for Apprenticeships and Technical Education (IfATE) emphasises the importance of continuous improvement in its quality assurance framework, and identifies the strongest end-point assessment (EPA) organisations as ones that see themselves as “learning organisations” constantly improving their performance through both internal quality assurance and feedback from stakeholders (IfATE, 2020[38]).
Conclusions
Assessments in the Brazilian VET system should balance different assessment methods. To support reliability, assessments should include some standardised elements, such as written or practical assessment tasks which are the same or very similar for all candidates. However, there is also a need to assess the performance of candidates undertaking realistic work tasks, or pursuing practical projects in the workplace. These tasks or projects should be carefully chosen so as to assess a wide range of competences required for the occupation, including soft and meta-skills such as creativity and teamwork, as well as more narrowly defined occupational skills. In longer programmes, partial assessments undertaken periodically in the course of a programme can play a very constructive role in providing feedback to students and teachers on learning progress, offering partial credit, as well as potentially feeding into a final assessment.
Full involvement of the productive sector, including employers and trade unions, enhances the quality of assessment and certification, and improves the credibility of certification. The productive sector should be involved fully both in the establishment of new curricula in the expanded VET system in Brazil and in updating existing curricula, as well as in the planning of assessment systems, as the productive sector has the most direct and up-to-date knowledge and experience of required competences. The sector might also be usefully involved in undertaking assessments of individual students, as this will add credibility to the consequent certification of occupational competence.
Brazil should include an independent element in assessment. Those most closely involved in a training programme, including vocational teachers and employers offering work placements, have direct knowledge of students and their capacities and have a useful input into assessment. This should be balanced by independent actors in assessment, who may be less likely to have biases because of any direct interest in the outcome, and who are in a stronger position to ensure consistent standards.
References
[26] Adecco (2017), Automation: analysing the potential risks and opportunities automation could bring to Britain’s labour market.
[16] Ahn, S. and S. Nyström (2020), “Simulation-based training in VET through the lens of a sociomaterial perspective”, Nordic Journal of Vocational Education and Training, Vol. 10/1, pp. 1-17, https://www.diva-portal.org/smash/get/diva2:1463505/FULLTEXT01.pdf.
[44] Apprenticeship toolbox (2019), Examination & Certification in Germany, https://www.apprenticeship-toolbox.eu/standards-matching/examination-certification/90-examination-certification-in-germany (accessed on 25 January 2022).
[48] Australian Skills Quality Authority (2021), Clauses 1.8 to 1.12—Conduct effective assessment, https://www.asqa.gov.au/standards/training-assessment/clauses-1.8-to-1.12#:~:text=The%20RTO%20informs%20the%20learner,and%20be%20reassessed%20if%20necessary.&text=Assessment%20is%20flexible%20to%20the,reflecting%20the%20learner's%20needs.
[43] BIBB (2017), http://www.nuv.cz/uploads/EQAVET/soubory/BIBB_Quality_Assurance_2017.pdf (accessed on 25 January 2022).
[46] BIBB (2017), Young people study in the company and at school, https://www.bibb.de/en/77203.php (accessed on 25 January 2022).
[47] BIBB (2016), Informationen zu Aus- und Fortbildungsberufen, https://www.bibb.de/dienst/berufesuche/de/index_berufesuche.php/profile/apprenticeship/110512 (accessed on 25 January 2022).
[23] Bjaelde, O., K. Lauridsen and A. Lindberg (2018), Current Trends in Assessment in Europe: the Way Forward, https://www.coimbra-group.eu/wp-content/uploads/WP-Trends-in-assessment-FINAL.pdf.
[28] Black, P. and D. Wiliam (1998), “Assessment and Classroom Learning”, Assessment in Education: Principles, Policy and Practice, Vol. 5/1, pp. 7-74, https://www.gla.ac.uk/t4/learningandteaching/files/PGCTHE/BlackandWiliam1998.pdf.
[45] Bliem, W., A. Petanovitsch and K. Schmid (2016), Dual Vocational Education and Training in Austria, Germany, Liechtenstein and Switzerland. Comparative Expert Study, https://ibw.at/bibliothek/id/413/.
[21] Canadian Council of Directors of Apprenticeship (2016), 2016 Annual Review, http://www.red-seal.ca/docms/2016ar_eng.pdf.
[40] CEDEFOP (2017), Spotlight on VET: Germany, https://www.cedefop.europa.eu/en/publications/8116.
[12] CEDEFOP (2015), “Ensuring the Quality of Certification in Vocational Education and Training”, Cedefop Research Paper, No. 51, Publications Office of the European Union, Luxembourg, https://data.europa.eu/doi/10.2801/79334.
[2] CEDEFOP (2014), VET in Europe: country report 2014, https://www.cedefop.europa.eu/en/publications-and-resources/country-reports/estonia-vet-europe-country-report-2014.
[41] CEDEFOP (2017), Spotlight on VET in Germany, CEDEFOP, https://www.cedefop.europa.eu/files/8116_en.pdf.
[27] Centre on Educational Policy (2013), Career Readiness Assessments across States: A Summary of Survey Findings, https://files.eric.ed.gov/fulltext/ED554578.pdf.
[36] Chłoń-Domińczak, A. (2019), Vocational Education and Training in Europe: Poland, https://cumulus.cedefop.europa.eu/files/vetelib/2019/Vocational_Education_Training_Europe_Poland_2018_Cedefop_ReferNet.pdf.
[35] Coles, M. and A. Bateman (2017), Towards Quality Assurance of Technical Vocational Education and Training, https://unesdoc.unesco.org/in/documentViewer.xhtml?v=2.1.196&id=p::usmarcdef_0000259282&file=/in/rest/annotationSVC/DownloadWatermarkedAttachment/attach_import_3ba90c4c-1cb8-407d-a917-81cb8106ce4b%3F_%3D259282eng.pdf&locale=en&multi=true&ark=/ark:/48223/p.
[31] Danish Ministry of Education (2008), The Danish Vocational Educational System, https://www.apprenticeship-toolbox.eu/files/144/Competent-Bodies/133/The_Danish_VET_System.pdf.
[11] Darr, C. (2005), “A hitchhiker’s guide to reliability”, https://assessment.tki.org.nz/Using-evidence-for-learning/Working-with-data/Concepts/Reliability-and-validity#:~:text=The%20reliability%20of%20an%20assessment,it%20was%20designed%20to%20measure.
[4] DfE (2020), Guidance. Introduction of T levels. Updated 4 September 2020, https://www.gov.uk/government/publications/introduction-of-t-levels/introduction-of-t-levels#grading-and-certification.
[13] Electrical Workers Registration Board (2021), Tuition Courses and Practical Assessments, https://www.ewrb.govt.nz/becoming-an-electrical-worker/tuition-courses-and-practical-assessments/.
[42] Euler, D. (2013), Germany’s dual vocational trainings: a model for other countries?, Bertelsmann Stiftung, https://www.bertelsmann-stiftung.de/fileadmin/files/BSt/Publikationen/GrauePublikationen/GP_Germanys_dual_vocational_training_system.pdf.
[24] European Alliance for Apprenticeships (2021), Examination and Certification in Luxembourg, https://www.apprenticeship-toolbox.eu/standards-matching/examination-certification/91-examination-certification-in-luxembourg.
[34] European Alliance for Apprenticeships (2018), Examination & Certification in Austria, https://www.apprenticeship-toolbox.eu/standards-matching/examination-certification/88-examination-certification-in-austria.
[20] Eurydice (2020), Assessment in Vocational Upper Secondary Education, https://eacea.ec.europa.eu/national-policies/eurydice/content/assessment-vocational-upper-secondary-education-24_en.
[7] Field, S. (2021), A World Without Maps: Assessment in Technical Education, https://www.gatsby.org.uk/uploads/education/reports/pdf/assessment-in-technical-education-simon-field.pdf.
[8] Field, S. and A. Guez (2018), Pathways of Progression: Between Technical and Vocational Education and Training and Post-Secondary Education., UNESCO, Paris, http://unesdoc.unesco.org/images/0026/002659/265943e.pdf.
[5] Finnish Ministry of Education and Culture (2017), Reform of vocational upper secondary education, https://minedu.fi/en/reform-of-vocational-upper-secondary-education.
[37] Government of Western Australia (2018), Training Accreditation Council. Fact Sheet. Assuring the Quality of RTO: Processes, Practices and Products, https://www.tac.wa.gov.au/SiteCollectionDocuments/D20%20006600.pdf (accessed on 24 January 2022).
[39] Haasler, S. (2020), “The German system of vocational education and training: challenges of gender, academisation and the integration of low-achieving youth”, Transfer: European Review of Labour and Research, Vol. 26/1, pp. 57-71, https://doi.org/10.1177/1024258919898115.
[9] IfATE (2021), End-point assessment plan for Pipe Welder apprenticeship standard, https://www.instituteforapprenticeships.org/media/3325/st0851_pipe_welder_l3_ap_final_for-publication_17072019.pdf.
[38] IfATE (2020), External Quality Assurance Annual Report 2020, https://www.instituteforapprenticeships.org/media/4724/eqa-annual-report.pdf.
[15] International Labour Organisation (2020), ILO Toolkit for Quality Apprenticeships: Volume 2: Guide for Practitioners, https://www.ilo.org/wcmsp5/groups/public/---ed_emp/---ifp_skills/documents/publication/wcms_751116.pdf.
[6] Karttunen, A. (2018), The big VET reform in Finland, https://nvl.org/content/the-big-vet-reform-in-finland.
[3] Kutsekoda Estonian Qualifications Authority (2021), Estonian Qualifications Authority, https://www.kutsekoda.ee/en/.
[32] Lee, M. and P. Newton (2021), Systematic divergence between teacher and test-based assessment: literature review, https://www.gov.uk/government/publications/systematic-divergence-between-teacher-and-test-based-assessment/systematic-divergence-between-teacher-and-test-based-assessment-literature-review#contents.
[25] Lerman, R. (2013), “Are employability skills learned in U.S. youth education and training programs?”, IZA J Labor Policy, Vol. 2/6, https://doi.org/10.1186/2193-9004-2-6.
[17] Ministère de Solidarité et de Santé (2018), Pour Les Étudiants Et Leurs Futurs Patients, Des Études Médicales Rénovées, https://solidarites-sante.gouv.fr/IMG/pdf/180705_-_dp_-_etudes_medicales_renovees.pdf.
[14] National Electrotechnical Training (2021), AM2, https://www.netservices.org.uk/am2/.
[10] New Zealand Ministry of Education (2021), Reliability and Validity, https://assessment.tki.org.nz/Using-evidence-for-learning/Working-with-data/Concepts/Reliability-and-validity#:~:text=The%20reliability%20of%20an%20assessment,it%20was%20designed%20to%20measure.
[30] Norwegian Ministry of Education and Research (2006), Education Act, https://lovdata.no/dokument/SF/forskrift/2006-06-23-724/KAPITTEL_5-2#KAPITTEL_5-2.
[1] OECD (2018), Getting Skills Right: Brazil, Getting Skills Right, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264309838-en.
[22] Skills Development Scotland and Centre for Workbased Learning (2018), Skills 4.0 A Skills Model to Drive Scotland’s Future, https://www.skillsdevelopmentscotland.co.uk/media/44684/skills-40_a-skills-model.pdf.
[18] Stanley, G. (2017), Accreditation and Assessment in VET, OUP.
[29] University of Exeter (2021), Improving Formative Assessment in Vocational Education and Literacy, Language and Numeracy, http://education.exeter.ac.uk/ifa/.
[33] Victoria State Government (2022), Independent Assessment Portal, https://independentassessment.vetassess.com.au/ (accessed on 24 January 2022).
[19] Yu, K. and G. Frempong (2012), “Standardise and individualise – an unsolvable tension in assessment?”, Education as Change, pp. 143-157, https://doi.org/10.1080/16823206.2012.692210.
Annex 5.A. Defining occupational competence for a blacksmith in England: Knowledge, skills and behaviours required
Knowledge
Health & Safety (H&S) - health & safety processes, legislation and regulations in the forge and on site, including COSHH and the Health and Safety at Work Act 1974. Knowledge of safe work processes that ensure the safety of self and others, such as personal health surveillance, hazard recognition, risk assessment, method statements, disposal of waste, equipment inspection and personal protective equipment. Knowledge of exposure, risk and prevention of flash burns, arc eye, radiant heat, noise exposure and fumes, as well as knowledge of preventing musculoskeletal and manual handling injuries.
Materials - the properties and uses of materials used for blacksmithing such as the effects of heat and working on forgeable metals. The effects of combining metals and other media such as wood, stone or plastic. Modes of supply, methods for handling and storing resources. The effects of the environment and techniques for protecting metalwork.
Tools - the key equipment, fixed and hand tools, the principles of operation, manufacture, set up, maintenance and safe use. Hand tools such as tongs, punches, chisels, hammers, anvil tools and jigs. Hand held machine tools such as drills and grinders. Fixed forge equipment such as power hammers, presses, forges, furnaces. Fixed fabrication equipment such as guillotines, rolls, metalworkers and linishers. Fixed welding equipment such as welding plant, profile cutters and extraction systems. Fixed machine equipment such as drills, lathes, milling and grinding machines.
Quality - knowledge of quality standards including those expected by the client, employer, suppliers and regulatory bodies, including methods of recording work, use of product data sheets, International Organization for Standardization (ISO) 9001, Conformité Européenne (CE) marking and building regulations.
Design – knowledge of elements and principles of design, drawing conventions and techniques by hand or computer aided design (CAD). Interpreting models and samples as part of the initial design process when presenting an idea to a client or as a component of design realisation when working out production samples.
Manufacture, conservation and repair of metalwork - Finishing and protection methods and processes. The occupational roles and responsibilities within the processes regarding work relationships such as knowledge of those responsible for advising the client or other relevant parties, an appreciation of the costs of time and materials in the production of forged ironwork and the issues involved in seeking approval for work to commission or for direct retail.
Setting up for work, problems that may occur and how to respond to them, knowledge of relevant mathematics and science such as volumes of metal required when calculating forging allowances, linear calculations for frameworks and bending, trigonometry for squaring and calculating angles when setting up working drawings or constructions, the chemistry and physics of ferrous metals in their heat treatment, the properties of common alloying elements and the chemistry of corrosion and its causes.
Context of the craft - the context of the craft such as design styles, notable blacksmiths and artistic movements. Historical and contemporary processes and techniques.
Skills
Health & Safety and working environment - maintain standards of health and safety for self and for others, using safe working practices. Prepare and maintain a safe working environment, where both hand and mechanical tools are used, as well as being able to safely handle fuel and light and operate the forge. Identify hazards and minimise risk in the working environment.
Vocational interpretation and understanding - create and interpret specifications, samples, drawings, and other written and verbal instructions for the manufacture or repair of metalwork. The identification and appropriate response to problems such as calculating jointing, forging and bending allowances, creation of working templates or jigs from drawings, arriving at an appropriate order of dismantling and construction including testing and adjustment, seeking advice and guidance as appropriate.
Design – produce vocational drawings, designs and models by hand or computer aided design (CAD) which can be interpreted by colleagues and clients when developing the final product.
Manufacturing and repair processes - select and use the appropriate processes, techniques, materials, tools and equipment for manufacture or repair of metalwork and undertake the blacksmith making process to the industrial standard from inception to realisation. Plan and manage time effectively.
Hot Forging - efficiently manage a forge or furnace when using forge tools to hot forge, form, cut and join metals by hand and machine.
Thermal Welding and cutting - use hand operated thermal equipment, cutting and joining techniques to cut and join metals.
Machining - use hand operated machine tools for cutting, drilling and shaping components.
Bench work - use hand tools to cold cut and shape materials. Join materials using fastening systems.
Tools, materials and equipment - carry out testing and adjustment. Manufacture, prepare and maintain materials, equipment and tools appropriately. Manufacture and maintain hand tools such as tongs, punches, chisels, hammers, anvil tools and jigs. Maintain equipment such as hand held machine tools, fixed forge equipment such as power hammers, presses, forges and furnaces. Fabrication, welding and engineering equipment. Preparing materials such as consumables, metal for the job, fixings and coatings.
Finishing – clean, prepare and protect metalwork. Finish surfaces by specifying and applying specified surface treatments, coatings or coverings as required such as wire brushing, degreasing, descaling, polishing, waxing, oiling, painting and specifying sub-contract finishes such as hot dip galvanising, electro polishing and powder coating.
Fitting - construct and fit work in the workshop and/or on site as appropriate, which includes assembly and dismantling of components and products and correcting faults in metalwork.
Behaviours
Promote positive safety culture – ensure at all times that work is carried out in a safe way that does not put themselves or others at risk.
Quality focused - work to appropriate quality standards such as working to client requirements, workshop samples, drawing specifications, historical listings, building regulations and workshop procedures, with efficient use of time, materials and resources. Record work and either self‑evaluate or obtain feedback from others to improve work and working practice.
Professionalism - have a strong professional work ethic including pride in work and attention to detail. Recognise the need for efficient and clear communication and the importance of working effectively with others. Promote and represent the craft, apply ethics and professional judgment in all areas of work. Take responsibility for own work and monitor the work of others.
Self-development - keep up to date with best practice and emerging technologies within the sector. Obtain and offer constructive feedback to others, develop and maintain professional relationships.
Annex 5.B. Assessment in detail: Vocational assessment in Germany in apprenticeship - the example of plumber qualifications
Introduction: Apprenticeship in Germany
Roughly half of young people pursue apprenticeships in the dual system, though this proportion has been declining. In addition, for some parts of the services sector there are programmes in vocational schools which are not apprenticeships.
The ability to change occupations and to switch between different school sites is limited, so the early allocation of pupils becomes a significant determinant of a candidate’s later career (Haasler, 2020[39]). The system has been criticised for its lack of potential for social mobility and perpetuation of inequalities (Haasler, 2020[39]). Conversely, apprenticeships enable a smooth transition into work, resulting in low youth unemployment (2015: 7.2% among those aged 15 to 24, versus 20.4% in the EU-28) (CEDEFOP, 2017[40]).
The dual system (apprenticeship) takes the form of training with an approved employer (70% of programme time) and theoretical education in a vocational school (30%). In 2016, 68% of graduates remained employed by the firm in which they had trained (Haasler, 2020[39]). For the majority of occupations, training in the dual vocational education and training system lasts 3 or 3.5 years. A small number of diplomas are offered for a training period of 2 years, with the option of extending to 3 or 3.5 years so as to acquire the complete set of skills required of the specific occupation. The Vocational Training Act and the Handicrafts Regulation Act (Handwerksordnung) provide the framework for shortening or lengthening training periods, allowing for differences in trainees’ prior skills and abilities.
For example, training for the “skilled worker for metal technology” lasts 2 years and provides the basic knowledge required. If the trainee completes this training programme successfully, they can then extend the training by 1.5 years in order to acquire the skills and competencies necessary to be employed as a metal worker.
The company and the vocational school both have an educational responsibility and provide training across both theory and practice. In other words, the vocational school is not limited to teaching theory and the training within a company is not limited to practical work.
Pathways to further learning
At a tertiary level, vocationally qualified applicants without a school-based higher education entrance qualification can access advanced vocational training (AVT) leading to qualifications at EQF level 6, such as technical engineer or Industriemeister (Cedefop, 2017[41]). Examinations for the Industriemeister, for example, are organised uniformly across the country by the Chambers of Industry and Commerce (IHK). A prerequisite for attendance is a subject-specific vocational qualification and sufficient professional practice (regulated differently in the various specialist areas).
Employer engagement in certification and assessment
Apprenticeship certification requires two examinations. The first takes the form of an intermediate examination, or apprenticeship certification exam part I. The second, the final examination or apprenticeship certification exam part II, is taken towards the end of the training programme (at the end of the 3 or 3.5 years). Both examinations cover theory and practice. The intermediate examination gives both the candidate and the training company an indication of the candidate’s current performance in theory and in practice. Increasingly, the intermediate examination results are also used summatively, contributing 30-40% of the overall score, with the final examination contributing the remainder.
There is a strict division between teaching and assessment in the dual system. Responsibility for assessing vocational skills lies with the public statutory bodies: the chambers of commerce and crafts. When final examinations are administered, teachers/trainers do not evaluate their own students (avoiding bias) and learning sites are not involved in the examination process (Euler, 2013[42]).
Examinations are administered by the Chambers (statutory employer organisations) and their appointed examining boards. Each board should comprise equal numbers of employers’ and employees’ representatives (together at least two-thirds of the members) and at least one vocational school teacher, all appointed by the competent body for a maximum period of five years (section 40, Vocational Training Act).
Quality assurance and oversight
The BIBB board issues recommendations for standardised model examination regulations under the Vocational Training Act (BIBB, 2017, p. 41[43]).
The Vocational Training Act contains framework regulations for examinations (the details as to subject matter etc. are set forth by the relevant training regulations and ordinances on further training).
The chamber develops examination regulations subject to the approval of the federal government. The regulations cover: admission, structure, evaluation procedures, certification, breaches of the regulations and resits (BIBB, 2017[43]).
Quality assurance of the examinations rests on impartiality and objectivity, secured through regulations governing the examiners: trainers are not involved in assessing their own trainees, and each individual examination performance is evaluated by at least two examiners.
Accelerated and alternative entry routes to the final apprenticeship examination
Trainees can be admitted in advance (before the end of the training period) if their performance justifies it.
Individuals who have worked in a specific occupation for at least 1.5 times the duration of its training period, or who can provide certification demonstrating that they possess the equivalent and necessary skills and competencies, can also be admitted.
Grading and certification
The examinations are graded as follows:
100-92 points = 1 = excellent
91-81 points = 2 = good
80-67 points = 3 = average
66-50 points = 4 = pass
49-30 points = 5 = poor
29-0 points = 6 = fail.
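The six-band scale above amounts to a simple threshold lookup. The sketch below illustrates this; the function name and return format are illustrative, not part of any official system:

```python
def exam_grade(points: int) -> tuple[int, str]:
    """Map an examination score (0-100 points) to the six-level grade scale."""
    bands = [
        (92, 1, "excellent"),
        (81, 2, "good"),
        (67, 3, "average"),
        (50, 4, "pass"),
        (30, 5, "poor"),
        (0, 6, "fail"),
    ]
    # Bands are ordered from highest to lowest threshold,
    # so the first match gives the correct grade.
    for threshold, grade, label in bands:
        if points >= threshold:
            return grade, label
    raise ValueError("points must be between 0 and 100")
```

For example, a score of 85 points falls in the 91-81 band and maps to grade 2 (“good”), while 50 points is the lowest passing score (grade 4).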
If a candidate does not pass the examination, they can resit the assessment up to two times.
In 2016, 444 200 final exams (413 200 initial exams, 31 000 repeat exams) were held and 399 800 people completed their vocational training by passing a final examination. 91.9% of participants (370 600 people) succeeded in doing so at their first attempt.
Certification
The apprentices receive three different certificates (Apprenticeship Toolbox, 2019[44]):
the examination certificate (Facharbeiterbrief/Gesellenbrief) provided by the chamber
the leaving certificate of the vocational school and
the reference of the training company.
A competent body, i.e. the Chamber of Industry and Commerce (IHK) or the Chamber of Skilled Crafts and Small Businesses (HWK), issues the certificate. These bodies organise all examination details (dates, examination committees) in the ‘Länder’ (federal states) and issue the exam certificates and final certificates (Bliem, Petanovitsch and Schmid, 2016, p. 34[45]).
Trainees can request that the result of their vocational school achievements be listed on the chamber certificate. They can also request that their certificate be accompanied by a translation in English and/or French so as to promote mobility (BIBB, 2017[46]).
In-company trainers also issue a certificate to their apprentices at the end of the training relationship, indicating the nature, duration and objective of the vocational training and the vocational skills, knowledge and abilities acquired by the apprentice (BIBB, 2017[46]).
Example: Plumber
Taken from the Ordinance on Vocational Training for Plumbers and Plumbing Technicians (Klempner), From 21 June 2013 (BIBB, 2016[47]).
The journeyman's examination consists of two parts.
Part 1 of the journeyman's examination (‘work order’ – 30% of marks)
Part 1 of the journeyman's examination shall take place before the end of the second year of training. The examinee should be able to:
use technical documentation, plan working steps, take and record measurements, and plan materials and tools
process material manually and mechanically, form, join and assemble material, produce templates and mould parts
take measures for work organisation, safety and health protection at work, environmental protection and quality assurance
show the technical background relevant to the examination task and justify the procedure.
The examinee is to carry out a work task typical of the occupation (duration: 7 hours), to conduct a situational technical discussion (duration: 15 minutes) relating to it and complete tasks in writing (60 minutes) which relate to the content of the work task.
Part 2 of the examination
Part 2 of the journeyman's examination involves three components: ‘customer order’; ‘production, assembly and maintenance technology’ and ‘economic and social studies’.
1. Customer order (40% of the marks).
The following requirements apply to the customer order examination area:
The examinee is to prove that they are able to plan work processes and sub-tasks, taking into account economic, technical, organisational and time specifications
Process, manufacture and assemble components or subassemblies, test them for function and fit them
Check the results of work for accuracy of fit, security and visual impression, and carry out corrective measures
Hand over components or assemblies to the customer, provide technical information, instruct customers and prepare acceptance reports
Explain the technical background relevant to the customer's order and justify the procedure.
Areas that can be selected:
roof cladding;
façade cladding;
drainage systems for precipitation water;
moulded parts of ventilation technology.
The candidate is to produce a test piece, document the production with documents customary in practice and conduct an order-related technical discussion (maximum duration 20 minutes). The examination time is 16 hours.
2. Assembly and maintenance technology (20% of the marks).
The examinee is to prove that they are able to:
draw up working plans for customer orders and developments
describe the procedure for the manufacture of a component or an assembly of plumbing technology
identify faults, describe causes, assess the consequences and describe measures to eliminate them
deal with technical problems by linking information technology, technological and mathematical facts
describe measures for preventive maintenance
take safety, economic efficiency and environmental protection into account.
The candidate shall complete a written assessment for the practical task. The examination time shall be 240 minutes.
3. Economics and social studies (10% of the marks).
In a written examination (60 minutes) in an occupation-related task, the candidate shall prove that they are able to present and assess general, economic and social contexts of the world of work and professions.
Passing regulations
The journeyman's examination is passed if:
The overall result of Part 1 and Part 2 is “sufficient”
The score for “customer order” is “sufficient”
The result of Part 2 is “sufficient”
At least one further examination area of Part 2 is “sufficient”
No examination area of Part 2 is “unsatisfactory”.
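Taken together, the weights (Part 1: 30%; customer order: 40%; assembly and maintenance technology: 20%; economics and social studies: 10%) and the five passing conditions can be sketched as a single check. This is an illustrative reading only: it assumes that “sufficient” corresponds to at least 50 points and “unsatisfactory” to fewer than 30 points on the grading scale above, and that the Part 2 result is the weighted average of its three areas. The ordinance itself governs the actual calculation.

```python
SUFFICIENT = 50      # assumed threshold for grade 4 ("pass") or better
UNSATISFACTORY = 30  # assumed threshold below which an area is "unsatisfactory"

def passed(part1: float, customer_order: float,
           assembly_maintenance: float, economics: float) -> bool:
    """Apply the five passing conditions to the four examination scores (0-100 each)."""
    part2_areas = [customer_order, assembly_maintenance, economics]
    # Part 2 result: weighted average of its areas (40/20/10 of the overall marks)
    part2 = (40 * customer_order + 20 * assembly_maintenance + 10 * economics) / 70
    overall = 0.30 * part1 + 0.70 * part2
    return (overall >= SUFFICIENT                                # overall result of Parts 1 and 2
            and customer_order >= SUFFICIENT                     # the "customer order" area
            and part2 >= SUFFICIENT                              # the result of Part 2
            and any(s >= SUFFICIENT
                    for s in [assembly_maintenance, economics])  # at least one further Part 2 area
            and all(s >= UNSATISFACTORY for s in part2_areas))   # no Part 2 area "unsatisfactory"
```

Note that a strong customer-order score cannot compensate for weak further areas: a candidate scoring 90 points on Part 1 and the customer order but only 40 on both remaining Part 2 areas would fail, because no further area reaches the “sufficient” threshold.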