Higher education contributes immensely to economic growth, social progress, and overall quality of life through the skills students and graduates acquire. Qualifications awarded by higher education institutions are valued because they are perceived to signal the skills required by labour markets and broader society. Employers use these qualifications to identify and select job candidates who have mastered essential skills. Higher education is trusted by employers and society to the extent that skills supply and demand remain in equilibrium.
However, there are signs that the skills supply of graduates no longer matches skills demand in the labour market. Quantitative qualifications mismatch is becoming a severe issue in many countries, compromising productivity, growth and the continued increase in prosperity. Even more significant is the qualitative mismatch between the skills demand generated by the economic and social reality in labour markets and societies, and the supply of skills by higher education institutions. Employers and economic organisations voice growing concern that graduates have not acquired the skills needed for the 21st-century workplace, in particular generic skills such as problem solving, communication, creativity, and critical thinking.
Whether perceived or real, skills mismatch poses a serious risk to the trustworthiness of higher education. What is needed is more transparency about the skills students acquire. Unfortunately, this has not been a strength of most higher education systems. Transparency tools such as international rankings are quite good at capturing research-related measures or input measures of education quality but provide no insight into students' actual learning outcomes. The few available measures, for example those provided by the OECD Survey of Adult Skills (PIAAC), are far from sufficient and only strengthen the demand for more and better metrics.
Between 2008 and 2013, the OECD led the Assessing Higher Education Learning Outcomes (AHELO) feasibility study. Despite a positive conclusion on the feasibility of the initiative, the proposal tabled by the OECD in 2015 to start the main study did not attract sufficient support and the project was abandoned. However, a small number of countries that supported the project decided to continue the endeavour at a smaller scale. The collaborative work concentrated on what was perceived to be the most interesting and urgent issue, i.e. the assessment of the generic skills of higher education students and graduates. The initiative found a partner in the Council for Aid to Education, Inc., a non-profit organisation in the United States with a long history of assessing generic skills in post-secondary education with its proprietary Collegiate Learning Assessment (CLA+) instrument. This volume reports on the work pursued between 2016 and 2021 to assess critical thinking and written communication, and associated skills in higher education institutions in six countries (the United States, the United Kingdom, Italy, Mexico, Finland, and Chile).
Part I explores the conceptual and methodological dimensions of assessing students’ generic learning outcomes. Chapter 1 outlines the issues regarding changing skills demand, skills mismatch, transparency, and trust in higher education.
Chapter 2 provides an extensive discussion of the methodological qualities of the CLA+ international instrument, which was used in the participating institutions and countries.
Chapter 3 provides a detailed insight into the development of the CLA+ International project, including the practicalities of translation and adaptation, test administration, scoring and reporting.
Part II of this report includes a statistical analysis of the integrated international database. The database has been constructed by aggregating the datasets from the assessments implemented between 2015 and 2021 in the six countries.
Chapter 5 includes the descriptive statistics of the database and the general distribution of mastery levels of scores and subscores.
Chapter 6 explores the relationships between demographic background variables and performance on the assessment, focusing on students’ primary language, gender and parental educational attainment.
Chapter 7 discusses the relationship between test scores and post-higher education career outcomes.
Chapter 8 examines differences in performance by instructional format and field of study.
Chapter 9 addresses performance differences between countries for entering and exiting students (excluding Italy).
Part III of this report discusses the assessment in each of the six participating countries. Each chapter reviews policy context, test administration, mastery levels, score distribution and data regarding effort and engagement.
Chapter 10 discusses the assessment in the United States, where the CLA+ has a long history and has acquired strong status and recognition. Chapter 11 turns to Italy, the first country outside the United States to implement the CLA+ assessment, as part of its nation-wide TECO project, and discusses its subsequent decision to move towards a different assessment approach.
Chapter 12 offers insight into the assessment in Finland, which implemented a system-wide administration in 2019-20.
Chapter 13 discusses the implementation of the CLA+ in a small set of institutions as part of a pilot study to assess learning gain in the United Kingdom. This case study shows the capacity of the assessment to serve as a diagnostic tool. The chapter also discusses the challenges associated with student recruitment and motivation.
Chapter 14 discusses the assessment in Mexico, more specifically the University of Guadalajara system, which has been one of the more enthusiastic early adopters of the CLA+ assessment outside the United States.
Chapter 15 deals with the test implementation and results in four private universities in Chile, part of an outreach effort into Latin America. A similar situation is discussed in Chapter 16, which examines the prospects for implementing the assessment in professional and vocational colleges across Australia and New Zealand.
Finally, Chapter 17 summarises the main conclusions of the report and lessons learnt from the country experiences presented in the individual country chapters.
This report is a follow-up to the AHELO feasibility study and is one of the first international studies of generic skills proficiency in higher education institutions. It does not provide definitive answers but shows the power of assessing critical-thinking skills and how such assessments can feed into the policy agenda in higher education at national and international levels.