The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies

1. The assessment frameworks for Cycle 2 of PIAAC: An introduction and overview

Abstract

This chapter introduces the assessment frameworks that define and describe the skills assessed in Cycle 2 of PIAAC. It provides some background to the PIAAC assessment, outlines the purposes of the assessment frameworks and explains how the understanding and conception of the skills measured in PIAAC have evolved over time.

Introduction
This volume contains the frameworks for the assessment of literacy, numeracy and adaptive problem solving in the second cycle of the OECD’s Programme for the International Assessment of Adult Competencies (PIAAC Cycle 2). This introductory chapter provides some context and background to the study as well as to the frameworks guiding the assessment. In particular, it describes:
the main features of the PIAAC assessment and how it relates to previous international assessments of adult literacy, numeracy and problem solving
the purposes of the assessment frameworks
the way in which the constructs assessed in PIAAC and its predecessors have been conceived.
The PIAAC assessment
PIAAC is an international assessment of the information-processing skills of adults. It assesses three broad skills: reading and understanding written texts (literacy), understanding and using mathematical and numerical information (numeracy) and solving problems. A comprehensive background questionnaire is also administered in conjunction with the assessment.
PIAAC is the third in a series of international adult assessments conducted since the mid-1990s. It builds on the experience of the International Adult Literacy Survey (IALS) and the Adult Literacy and Life Skills Survey (ALL).1 IALS collected data in three waves between 1994 and 1998 in 22 countries and regions. ALL collected data in two waves over the period 2002-2008 in 11 countries and regions.
The study is designed as a repeated cross-sectional study that provides comparable estimates of proficiency in literacy and numeracy over time. The first cycle of the assessment took place over the period 2008-2019 with three data collection rounds: the first in 2011-12, the second in 2014-15 and the third in 2017-18.2 A total of 39 countries/regions took part in the first cycle of the study and 33 are currently preparing to collect data in the second cycle (see Table 1.1). Preparations for Cycle 2 of the assessment began in 2018. Data collection was originally planned for 2021-22, ten years after data collection in the first round of Cycle 1, but due to the COVID-19 crisis of 2020, which delayed the Field Trial, it has been rescheduled to 2022-23.
Data are collected in PIAAC using a combination of personal interview and a self-completed assessment. Data collection takes place in the respondent’s own home3 under the supervision of trained interviewers. The background questionnaire is administered in Computer Aided Personal Interview (CAPI) mode by the interviewer. Following completion of the background questionnaire, the respondent completes the assessment under the supervision of the interviewer. In the first cycle of the study, the assessment could be completed on a laptop computer or in paper-and-pencil format. The computer-based assessment (CBA) format constituted the default format with the paper-based assessment (PBA) option being made available to those respondents who had little or no familiarity with computers, had poor information communications technology (ICT) skills, or who did not wish to take the assessment on computer. In the second cycle of the study, the assessment will be delivered on a tablet device. The assessment interface has been designed in such a way as to ensure that most, if not all, respondents will be able to take the assessment on the tablet even if they have limited experience with such devices. It will still be possible for participating countries to provide a paper-based option to respondents who cannot or are unwilling to take the assessment on the tablet.
Table 1.1. Countries and regions participating in PIAAC
| PIAAC Cycle 1: Round 1 (Main study 2011-12) | PIAAC Cycle 1: Round 2 (Main study 2014-15) | PIAAC Cycle 1: Round 3 (Main study 2017-18) | PIAAC Cycle 2 (Main study 2022-23) |
|---|---|---|---|
| Australia | Chile | Ecuador | Australia |
| Austria | Greece | Hungary | Austria |
| Canada | Jakarta (Indonesia)2 | Kazakhstan | Canada |
| Cyprus1 | Israel | Mexico | Chile |
| Czech Republic | Lithuania | Peru | Croatia |
| Denmark | New Zealand | United States | Czech Republic |
| England (UK) | Singapore | | Denmark |
| Estonia | Slovenia | | England (UK) |
| Finland | Turkey | | Estonia |
| Flanders (Belgium) | | | Finland |
| France | | | Flanders (Belgium) |
| Germany | | | France |
| Ireland | | | Germany |
| Italy | | | Hungary |
| Japan | | | Ireland |
| Korea | | | Israel |
| Netherlands | | | Italy |
| Northern Ireland (UK) | | | Japan |
| Norway | | | Korea |
| Poland | | | Latvia |
| Russian Federation | | | Lithuania |
| Slovak Republic | | | Netherlands |
| Spain | | | New Zealand |
| Sweden | | | Norway |
| United States3 | | | Poland |
| | | | Portugal |
| | | | Russian Federation |
| | | | Singapore |
| | | | Spain |
| | | | Sweden |
| | | | Switzerland |
| | | | United States |
1. Note by Turkey:
The information in this document with reference to “Cyprus” relates to the southern part of the Island. There is no single authority representing both Turkish and Greek Cypriot people on the Island. Turkey recognises the Turkish Republic of Northern Cyprus (TRNC). Until a lasting and equitable solution is found within the context of the United Nations, Turkey shall preserve its position concerning the “Cyprus issue”.
Note by all the European Union Member States of the OECD and the European Union:
The Republic of Cyprus is recognised by all members of the United Nations with the exception of Turkey. The information in this document relates to the area under the effective control of the Government of the Republic of Cyprus.
2. Indonesia’s data were subsequently withdrawn.
3. The United States also collected data as part of a PIAAC National Supplement in 2013-14. This included representative samples of a) unemployed adults (aged 16-65); b) young adults (aged 16-34) and c) older adults (aged 66-74). See Krenzke et al. (2019[1]) for details.
The basic specifications for the design of PIAAC (common across the two cycles of the study) are summarised in Table 1.2. More details regarding Cycle 1 of the study can be found in PIAAC (2014[2]).
Table 1.2. Key features of the PIAAC study design
| Feature | Description |
|---|---|
| Target population | Non-institutionalised adults aged 16-65 years normally resident in the national territory of the participating country. |
| Sample frame | The sample frame should cover the target population. Exclusions of up to 5% of the target population are permitted. |
| Sample design | Probability-based sample with each individual in the target population having a known probability of selection. |
| Sample size | Minimum sample size of 5 000 completed cases per reporting language. |
| Data collection method | Computer-aided personal interview and self-completed assessment under the supervision of the interviewer. |
| Mode of assessment | Computer-delivered (Cycle 1) and tablet-delivered (Cycle 2) assessment with a paper-based alternative for respondents with insufficient experience of the use of digital devices. |
| Quality assurance and quality control | Central review of key elements of the study such as sampling, translation and adaptation of instruments. Monitoring of data collection. Data adjudication based on indicators of data quality. |
Instrumentation
As noted above, respondents complete both a background questionnaire and a skills assessment.
The background questionnaire in PIAAC Cycle 2 will consist of 11 modules collecting information on demographic characteristics, social and language background, education, labour-force participation, employment, the task composition of the respondent’s job, literacy and numeracy practices and personality traits.4
The direct assessment involves the following components:
a locator test
an assessment of reading and numeracy components
assessments of literacy, numeracy and adaptive problem solving.
The locator test consists of eight literacy and eight numeracy items of low difficulty. It is designed to provide an initial estimate of the proficiency of the respondent. This is used to direct the respondent to the testing pathway appropriate to his/her proficiency (see below).
The reading and numeracy components assessment consists of a set of items assessing:
the ability to understand the meaning of simple sentences and to read and understand short passages fluently (reading)
understanding basic notions of quantity and magnitude (numeracy).
The assessments of literacy, numeracy and adaptive problem solving each consist of around 80 items. Any individual respondent is administered test items covering only two of the three domains and in each of these domains he or she is presented a subset of the test items. In all three domains, the assessments use an adaptive design. The goal is to maximise the efficiency and precision of the assessment by presenting respondents with test items that are neither too easy nor too difficult for them.
In each domain, the assessment consists of a set of units in which each unit is made up of one or more stimuli (e.g. a description of a problem situation, a text, a table – see Figure 1.2 below) and a set of questions or tasks. These units are combined into groups called ‘testlets’ with different average levels of difficulty. The testlets are presented to respondents in two stages. Information from the background questionnaire, the component measures and the locator is used to assign a testlet that is most appropriate for the respondent at Stage 1. The respondent’s performance on the Stage 1 testlet is automatically scored. The test application then assigns a second testlet to the respondent based on his/her performance on the first. While all respondents have a small probability of being allocated any testlet, they have a greater probability of being allocated a testlet closer to their estimated proficiency. For example, at each stage in the assessment, a respondent of high estimated ability has a greater chance of being allocated a testlet of high average difficulty than does a respondent with lower estimated proficiency.
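To make this two-stage logic concrete, the following is a minimal sketch in Python of a probabilistic testlet assignment of the kind described above. The testlet labels, difficulty values, proficiency scale and weighting function are all invented for illustration; the operational selection probabilities used in PIAAC are defined in the study’s technical documentation.

```python
import random

# A minimal sketch (not the operational PIAAC algorithm) of probabilistic
# two-stage testlet assignment. Difficulties and weights are illustrative.

TESTLET_DIFFICULTY = {"easy": -1.0, "medium": 0.0, "hard": 1.0}

def assign_testlet(estimated_proficiency: float) -> str:
    """Randomly pick a testlet, favouring those whose average difficulty
    is close to the respondent's estimated proficiency while keeping a
    non-zero probability for every testlet."""
    weights = [
        1.0 / (1.0 + abs(estimated_proficiency - difficulty))
        for difficulty in TESTLET_DIFFICULTY.values()
    ]
    return random.choices(list(TESTLET_DIFFICULTY), weights=weights)[0]

# Stage 1: the estimate comes from the background questionnaire, the
# components and the locator; Stage 2 re-estimates proficiency from the
# automatically scored Stage 1 responses.
stage1_testlet = assign_testlet(estimated_proficiency=0.8)
stage2_testlet = assign_testlet(estimated_proficiency=1.1)  # updated estimate
print(stage1_testlet, stage2_testlet)
```

Note that every testlet keeps a non-zero selection probability, mirroring the design requirement that all respondents have a small chance of being allocated any testlet.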
The design for the main study in PIAAC Cycle 2 is presented in Figure 1.1 below. The background questionnaire is administered in CAPI mode by the interviewer and is estimated to take 20-45 minutes to complete depending on the situation of the respondent (with an average of around 30 minutes). The direct assessment is completed by the respondent on a tablet device supplied by the interviewer. The average time for completion of the assessment is estimated to be 60 minutes. However, as PIAAC is not a timed assessment, actual completion times are expected to vary widely.
Respondents undertake the assessment in the following sequence:
The interviewer administers the background questionnaire. The background questionnaire is answered by all respondents and includes a set of questions dealing with the familiarity of the respondent with electronic devices.
After agreeing to continue with the survey, the respondent is handed the tablet device on which he/she completes the assessment. The interviewer demonstrates the basic skills required to complete the direct assessment tasks, e.g., tapping, using drag and drop, and highlighting.
Respondents then complete a tutorial in which they perform each of the skills independently.
The respondent then completes the locator test.
Depending on their responses to relevant background items and their performance on locator tasks, respondents are directed to one of three paths (a schematic sketch of this routing follows the list):
Respondents who ‘fail’ the locator follow path 1 and receive the reading and numeracy components only.
Respondents who ‘pass’ the locator, but perform relatively poorly, follow path 2 and receive the components plus the two-stage adaptive modules of literacy, numeracy, or adaptive problem solving (APS).
Respondents who perform well on the locator test follow path 3. A quarter of these respondents are randomly assigned to the reading and numeracy components assessments before moving on to the two-stage adaptive modules of literacy, numeracy, or adaptive problem solving (APS), while the other 75% of respondents proceed directly to the two-stage cognitive modules.
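The routing rule just described can be summarised in a short, hypothetical sketch. The pass and high-performance cut-scores shown here are invented for illustration; the actual cut-scores are set in the PIAAC operational design.

```python
import random

# A hypothetical sketch of the three-path routing rule. The locator has
# 16 items (8 literacy + 8 numeracy); the cut-scores below are invented
# for illustration and are not the operational PIAAC thresholds.

def route_respondent(locator_score: int,
                     pass_mark: int = 6,
                     high_mark: int = 12) -> int:
    """Return the assessment path (1, 2 or 3) for a locator score
    out of 16."""
    if locator_score < pass_mark:
        return 1  # path 1: reading and numeracy components only
    if locator_score < high_mark:
        return 2  # path 2: components + two-stage adaptive modules
    return 3      # path 3: adaptive modules (25% also take components)

def path3_takes_components() -> bool:
    """On path 3, a random quarter of respondents complete the
    components assessment before the adaptive modules."""
    return random.random() < 0.25
```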
The assessment tasks in PIAAC consist of 1) a set of instructions and a question or task statement that defines what the respondent must do to complete the task, 2) stimulus material (e.g. texts, graphic representations, simulated websites) with which the respondent must interact to complete the task and 3) a means of registering a response. All items in the assessment have the same format. The instructions to the respondent and the task question/statement together with forward and back arrows and access to help are on the left-hand side of the screen with the stimulus material(s) on the right. Responses are recorded on the left-hand side as in the sample item below or through interaction with the stimulus material. Figure 1.2 provides an example of a PIAAC computer-based test item.
The response modes used in PIAAC Cycle 1 were numeric entry, clicking on multiple choice check boxes, radio buttons and pull-down menus (left-hand side of the screen), and highlighting or clicking on elements in the stimulus material – text, graphic elements, links (in simulated web environments) and check boxes (right-hand side) (see OECD (2019[3]), Section 5.2.1). In PIAAC Cycle 2, similar response modes will be used, with respondents interacting with the test application interface using a stylus or by tapping with their fingers. A simulated calculator will be used for numeric entry. No constructed responses are used in PIAAC.
Assessment frameworks
In large-scale international assessments, the constructs measured are usually described by an assessment framework.5 The framework has a dual purpose: 1) to guide the development of the items (tasks) used to assess the skill in question and 2) to guide the interpretation of the results of the assessment. To this end, the framework provides a definition and detailed description of the features of the construct assessed. In addition, it outlines the recommended approach to the assessment of the skill in question (e.g. the recommended coverage of the various aspects or dimensions of the construct) and discusses other matters relevant to test development, such as the factors that affect the difficulty of items.
Table 1.3. Main features of the assessment frameworks for PIAAC Cycle 2
| | Literacy | Numeracy | Adaptive Problem Solving |
|---|---|---|---|
| Definition | Literacy is accessing, understanding, evaluating and reflecting on written texts in order to achieve one’s goals, to develop one’s knowledge and potential and to participate in society. | Numeracy is accessing, using and reasoning critically with mathematical content, information and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life. | Adaptive problem solving involves the capacity to achieve one’s goals in a dynamic situation, in which a method for solution is not immediately available. It requires engaging in cognitive and metacognitive processes to define the problem, search for information, and apply a solution in a variety of information environments and contexts. |
| Cognitive processes | | | |
| Content | Texts characterised by their: | Mathematical content, information and ideas; mathematical representations | Task dimensions |
| Contexts | | | |
In PIAAC, the skills assessed are described in terms of 1) a broad definition, and 2) the dimensions of:
Cognitive processes: the mental processes that form part of the skill in question.
Content: the artefacts, knowledge, representations and situations that constitute the ‘object(s)’ to which these cognitive processes are applied.
Contexts: the settings in which the skill is used.
The main components of the assessment frameworks for PIAAC Cycle 2 are summarised in Table 1.3.
Some of the key implications for the assessment of these skills arising from the frameworks are briefly discussed below.
Coverage of the constructs
In order for the assessment to represent the construct adequately, the set of tasks that constitute the assessment must include tasks designed to cover the range of cognitive processes, types of content and contexts identified by the framework. To this end, each of the framework documents proposes a desirable distribution of tasks across the different dimensions of the framework.
Factors affecting the difficulty of assessment tasks
The PIAAC assessment is intended to measure the entire range of proficiency in the skills of interest that exists in the adult population – from very low to very high. The adult population in participating countries includes individuals who have completed no more than primary education as well as adults who have completed post-doctoral studies. In addition, in countries with relatively high levels of immigration, a substantial proportion of the population may have limited proficiency in the language or languages in which the assessment is delivered.6
The frameworks identify the factors that affect task difficulty and can be manipulated to ensure that tasks covering the full spectrum from very easy to very difficult are included in the assessments. In broad terms, these can be categorised as encompassing features of:
the task statement (e.g. the instructions provided to test-takers, the explicitness of the presentation and definition of the task to be completed)
the stimulus material (e.g. its complexity, length, organisation)
the interaction of task and stimulus (e.g. the presence of distracting/irrelevant material, the number of operations/steps required to be undertaken to successfully complete the task).
Authenticity of tasks
The skills assessed in PIAAC are primarily conceived as skills that enable adults to engage and function effectively in social and economic life and perform the range of tasks required in their various social roles. In line with this focus, assessment tasks are intended to represent the types of reading, mathematical and problem solving demands and situations that adults generally face in their everyday lives. In the words of the numeracy framework document: ‘PIAAC is interested in the ability of individuals to cope with tasks that are embedded in the real world, rather than assessing decontextualised mathematical tasks’. Stimulus materials (e.g. the texts that respondents must read, the presentations and representations of numerical and mathematical information and the problem situations to which they must respond) represent the kinds of texts, mathematical information and problems that adults encounter in ‘real-world’ situations. Regarding the stimulus material used in literacy tasks, for example:
Many of them are directly drawn from authentic materials with little, if any adaptation. This means that no effort is made to make these texts easier to read or to improve their organisation or presentation. Using naturalistic texts, sometimes even clearly suboptimal ones (for instance, poorly organised or using complex language), ensures a high level of face validity. However, no artificial difficulty or flaw is introduced at the time of test design. (see literacy framework)
Content appropriate to the entire adult population
As PIAAC is an assessment of the entire adult population aged 16-65 years, the assessment tasks do not assume highly technical or occupation-specific knowledge. In addition, they do not assume knowledge or skills relevant in formal educational settings such as the use of formal mathematical notation and symbolisation. This reflects the fact that there are countries in which a significant proportion of adults (especially older adults) have very low educational attainment and, more importantly, the reality that most adults left the formal education system long ago. In the case of adults aged 55-65 years, for example, most will have completed their education 40 to 50 years earlier.
Assessment at low skill levels: Reading and numeracy components
One of the challenges in the assessment of the information-processing skills of adults is to gain information regarding the skills of adults with low proficiency. Low skills are manifested through the inability of a test-taker to successfully complete most tasks in the assessment. In other words, for this group, a lot is known about what they cannot do and little about what they can do.
To provide more information regarding the skills of low-skilled readers, an assessment of reading component skills was introduced in PIAAC Cycle 1 (Sabatini and Bruce, 2009[4]). This covered three skills: print vocabulary, sentence processing and passage fluency. Print vocabulary assessed basic vocabulary knowledge, sentence processing evaluated the ability to understand the semantic logic of simple sentences, and passage fluency assessed the capacity to understand passages of text. Reading components will continue to be assessed in PIAAC Cycle 2 with some modifications. Only two skills (sentence processing and passage fluency) will be assessed.
An assessment of numeracy components has been developed and will be administered as part of PIAAC Cycle 2. This involves two types of tasks designed to measure number sense: 1) identifying how many objects are displayed in photographs of real-life items, and 2) selecting the biggest number from a set of four choices.
No component measures have been developed in the domain of APS. The experience with previous assessments of problem solving has been that a reasonable level of proficiency in literacy and numeracy is a precondition for the successful completion of problem solving items. This is expected to be true also for the assessment of APS. As can be seen from the presentation of the drivers of task difficulty in APS (APS Framework, Table 4.A1.1), even simple problems have a level of complexity and difficulty far in excess of the types of tasks that form the literacy and numeracy component measures.
The evolution of assessment frameworks in international adult assessments
As noted above, PIAAC Cycle 2 is the latest in a series of related international assessments of adults. Table 1.4 presents the domains assessed in each successive study from IALS to PIAAC Cycle 2. The domains in which results are psychometrically linked and can be compared over time are indicated by shading.
The assessment frameworks in each of the broad domains assessed in adult skills assessments have evolved considerably since IALS was conducted in the mid- to late-1990s. This is most obvious in the domain of problem solving, where different (albeit related) constructs were measured in ALL and in PIAAC Cycles 1 and 2, and in that of managing numerical and mathematical information, where the construct of numeracy was introduced in ALL in place of quantitative literacy. However, even within the domains of reading and of numeracy, there has been considerable change in the conceptualisation of the constructs between assessments. These changes are briefly described below and summarised in Tables 1.A.1-1.A.3 in Annex 1.A.
Table 1.4. Domains assessed in IALS, ALL and PIAAC
| Domains assessed | Reading | Managing numerical and mathematical information | Problem solving |
|---|---|---|---|
| IALS | Prose Literacy; Document Literacy | Quantitative Literacy | |
| ALL | Prose Literacy; Document Literacy | Numeracy | Analytic Problem Solving |
| PIAAC Cycle 1 | Literacy + Literacy Components | Numeracy | Problem Solving in Technology-Rich Environments |
| PIAAC Cycle 2 | Literacy + Literacy Components | Numeracy + Numeracy Components | Adaptive Problem Solving |
Understanding the evolution of the assessment frameworks and, therefore, of the constructs measured is important for the interpretation of the distributions of skills observed both within and between assessments. The link between the most recent and the older assessments becomes more attenuated over time as the constructs continue to evolve. While the different international adult assessments have been designed to be linked psychometrically in the domains of literacy (IALS and its successors) and numeracy (ALL and its successors), the constructs measured have undergone considerable revision and extension even if a common core remains. Literacy as it will be measured in PIAAC Cycle 2 in 2022-23 is not exactly the same as literacy as measured in PIAAC Cycle 1, ALL and IALS, and the same is true for numeracy. In particular, although IALS and ALL recognised the growing importance of electronic texts, those two earlier assessments were delivered only on paper. Starting with PIAAC Cycle 1, the assessment moved to computer delivery which provided a means to include various types of electronic texts and materials.
The evolution of the assessment frameworks in large-scale assessments (including adult assessments) is the outcome of competing demands: on the one hand, the desire for continuity in measures (to provide reliable measures of change over time) and, on the other, the need for measures to be relevant to contemporary realities and understandings of the phenomena measured. Three main factors push in the direction of change: developments in the understanding of the skills measured; technological and social developments that affect the nature and practice of these skills in everyday life, work and study; and technological and methodological advances in the science and practice of measurement.7
The assessment of problem solving provides a particular illustration of the impact of the forces that lead to change in large-scale assessment. Of the domains assessed in PIAAC and its predecessors, it is the one in which the impact of the introduction of computer-based testing has been greatest, as it opened up possibilities for its assessment that did not exist in a world of paper-based tests. In addition, the demand for measures of problem solving that speak to current understandings of the phenomenon has been evident in the changes in the points of view from which the assessment of problem solving has been approached over time.
As in any area of scientific endeavour, the understanding of the skills assessed in large-scale assessments changes over time. This is a consequence of theoretical developments as well as reflection on the outcomes of empirical research including the results of large-scale assessments themselves. Comprehensive discussions of the theoretical and conceptual considerations that led to the development of the assessment of APS and to the substantial revision of the numeracy assessment framework in Cycle 2 of PIAAC, can be found in Greiff et al. (2017[5]) for APS and Tout et al. (2017[6]) and Tout (2020[7]) for numeracy as well as in framework documents included in this publication.
The nature of skills such as literacy, numeracy and problem solving has changed in many ways since the early 1990s. Information and communications technologies have altered what it is to read, engage with numerical and mathematical information and solve problems by changing the ways in which information is accessed, communicated and analysed and transformed. For example, print-based texts and representations constituted the source of much of the information accessed by adults in the mid-1990s. At the start of the third decade of the 21st century, electronic texts and representations accessed through digital devices (e.g. computers, tablets, and smartphones) and applications (e.g. web browsers, hypertext, pdf and html files) have become primary sources of information. This has involved the appearance of new types of texts and representations; new forms of navigation within and between texts and representations (scrolling, clicking on icons or radio buttons, hyperlinks); and new tools for the processing and communication of information and increased interlinkages between texts, documents and representations (hypertext, strings of related texts). In addition, on-line service delivery has increased the information-processing demands on adults through the reduction (or removal) of the role of intermediaries in providing access to information and assistance with decision making in many domains (e.g. health, finances and travel).
ICTs have also transformed assessment. The introduction of computer-based assessment (CBA) has had a major impact on the design, delivery and processing of assessments and on the quality, amount and complexity of the resulting data. It has made possible the assessment of proficiency in the digital dimensions of information-processing skills (e.g. the reading of electronic texts, interaction with digital tools presenting and transforming mathematical information, the use of ICT applications to access and transform information to solve problems). It has also enabled the development of more complex assessment tasks. For example, digital assessment platforms make it possible to design tasks that are iterative in nature, and in which not all information is given as part of the initial conditions, as well as tasks involving complex displays of information, modelling and exploration of variation in a range of parameters. This is particularly important in the assessment of problem solving. The introduction of CBA has also permitted the implementation of more complex and efficient test designs (e.g. adaptive testing) as well as features such as automatic scoring. It has also allowed the development of more efficient and timely quality assurance and control procedures and considerably increased the possibilities of identifying data fabrication and fraud. The availability of log-files in which interactions between test-takers and the testing application are captured and stored has provided a new and rich source of data for analysts and test developers interested in understanding test-taking behaviour.8
The introduction of CBA as the default assessment mode in PIAAC Cycle 1 constituted one of the major factors influencing the evolution of the assessment frameworks of adult skills assessments between IALS and PIAAC. This made it possible for PIAAC to 1) reflect the changes in the practices of reading, managing mathematical and numerical information and problem solving brought about by the diffusion of digital tools and media in the way it assessed these skills and 2) use much more efficient test designs for adults.
Box 1.1. Assessment frameworks for previous assessments of adult literacy
Presentations of the assessment frameworks for IALS, ALL and PIAAC Cycle 1 can be found in the following documents:
IALS
Murray, S., I. Kirsch and L. Jenkins (eds) (1998[8]), Adult Literacy in OECD Countries: Technical Report on the First International Adult Literacy Survey, National Center for Education Statistics, Office of Educational Research and Improvement, Washington, DC.
OECD/Statistics Canada (2000[9]), Literacy in the Information Age: Final Report of the International Adult Literacy Survey, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264181762-en.
ALL
Murray, S., Y. Clermont and M. Binkley (eds) (2005[10]), Measuring Adult Literacy and Life Skills: New Frameworks for Assessment, Statistics Canada, Ottawa, Catalogue No. 89-552-MIE, No. 13.
PIAAC Cycle 1
OECD (2012[11]), Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264128859-en.
PIAAC Expert Group in Problem Solving in Technology-Rich Environments (2009[12]), “PIAAC Problem Solving in Technology-Rich Environments: A Conceptual Framework”, OECD Education Working Papers, No. 36, OECD Publishing, Paris, http://dx.doi.org/10.1787/220262483674.
PIAAC Literacy Expert Group (2009[13]), “PIAAC Literacy: A Conceptual Framework”, OECD Education Working Papers, No. 34, OECD Publishing, Paris, http://dx.doi.org/10.1787/220348414075.
PIAAC Numeracy Expert Group (2009[14]), “PIAAC Numeracy: A Conceptual Framework”, OECD Education Working Papers, No. 35, OECD Publishing, Paris, http://dx.doi.org/10.1787/220337421165.
Sabatini and Bruce (2009[4]), “PIAAC Reading Component: A Conceptual Framework”, OECD Education Working Papers, No. 33, OECD Publishing, Paris, http://dx.doi.org/10.1787/220367414132.
Developments in literacy
The evolution of the constructs of literacy from IALS to PIAAC Cycle 2 has occurred in four main areas: 1) a reduction of the number of separate domains of literacy assessed, 2) the expansion of the range of text types covered in the assessment, 3) an increasing emphasis on evaluation and metacognition as cognitive strategies required for effective reading and 4) the disentangling of the description and specification of cognitive strategies from questions of task difficulty.9
In IALS, three separate domains of literacy were assessed and represented by separate scales: prose, document and quantitative literacy (Murray, Kirsch and Jenkins, 1998[8]). Prose literacy covered the reading of continuous texts or texts organised in paragraphs. Document literacy covered the reading of written information presented in matrix formats (e.g. tables and lists). Quantitative literacy represented the knowledge and skills required to apply arithmetic operations to numbers embedded in printed materials. ALL continued to assess prose and document literacy as separate domains (Murray, Clermont and Binkley, 2005[10]). However, the assessment of quantitative literacy was dropped in ALL and replaced by the assessment of numeracy (see below). The construct of ‘literacy’ as a single domain was introduced in PIAAC Cycle 1.
‘Literacy’ as defined in PIAAC Cycle 1 represented a global construct that no longer differentiated between the reading of prose and document texts. The other major (and probably the most significant) development was the expansion of the range of texts covered by the assessment to include digital or electronic texts.10 In PIAAC Cycle 2, the classification of texts has been revised to include the dimensions of organisation (density of content, representations and access devices) and source (single or multiple authors/publishers) to better represent the universe of texts accessible in digital environments, including the interactive texts typical of the Web 2.0.
The conceptualisation of the cognitive strategies brought into play by competent readers has also evolved between assessments. In IALS/ALL, the cognitive strategies were conceived in terms of the ‘matching’ of information in the question (the given information) to the information in the stimulus text to respond correctly to a question or directive. These ‘matching’ strategies included the identification of pieces of information in the text (locating/cycling), connecting different parts of the text (integrating), and developing some understanding of the text as a whole (generating). In PIAAC Cycle 1, ‘evaluation and reflection’ (the making of judgements regarding aspects of the text such as truthfulness, relevance and quality) was added as a cognitive strategy required of competent readers. The dimension of evaluation has been further emphasised in Cycle 2 where it is conceived in terms of the evaluation of the accuracy, soundness, and task relevance of a text in relation to both its source and content.
There has also been a gradual separation of the identification and description of cognitive processes involved in literacy from the description of the factors that make assessment tasks more or less difficult. In IALS/ALL, matching strategies were treated as one of the three main factors determining task difficulty, the second being the type of information requested by the question and the third, the plausibility of distractors (the presence of other information in the stimulus text that could distract the test-taker’s attention from the information needed to answer the question) (Murray, Clermont and Binkley, 2005, pp. 101-103[10]). The Cycle 2 framework treats cognitive strategies and the factors affecting task difficulty independently. Task difficulty is conceived as being driven by the features of the stimulus text(s), the formulation of the question/task description and the interaction of the text and question/task description (see literacy framework, Table 2.5).
The assessment of reading components was another new element of the assessment of literacy introduced in PIAAC Cycle 1 (Sabatini and Bruce, 2009[4]) to provide more detailed information about adults with poor literacy skills. Reading components were defined as the basic set of decoding skills essential for extracting meaning from written texts: knowledge of vocabulary (word recognition), the ability to process meaning at the level of the sentence, and fluency in reading passages of text. In PIAAC Cycle 2, the assessment of reading components will be continued but cover only the sentence meaning and passage fluency dimensions. Performance on the reading components tasks will also be integrated as part of the literacy proficiency scale in Cycle 2,11 adding precision to its lower end.
Developments in numeracy
The measurement of ‘numeracy’ was introduced in ALL. This replaced the assessment of ‘quantitative literacy’ conducted in IALS. The rationale for the development of an assessment of numeracy was that the assessments of quantitative and document literacy represented ‘only a subset of the much wider range of tasks and responses that are typical of many every day and work tasks’ (Murray, Clermont and Binkley, 2005, p. 148[10]) relating to the engagement with mathematical information. In particular, key aspects of mathematical information such as measurements and shapes, as well as information in formats that did not require comprehension of text, were not covered. The construct of ‘numeracy’ was developed to more comprehensively cover the mathematical knowledge and skills relevant in work and the everyday life of adults.
[Numeracy’s] key concepts relate in a broad way to situation management and to a range of effective responses (not only to application of arithmetical skills). It refers to a wide range of skills and knowledge (not only to computational operations) and to a wide range of situations that present actors with mathematical information of different types (not only those involving numbers embedded in printed materials). (Murray, Clermont and Binkley, 2005, p. 151[10])
In contrast with the domain of literacy, only minor changes to the specification of the numeracy domain were made in PIAAC Cycle 1 compared to ALL. These concerned presentation more than content. One of the major drivers for the revision of the numeracy assessment framework for PIAAC Cycle 2 was the view that the assessment of numeracy in the 21st century had to be expanded to cover the engagement with mathematical information in digital environments, as well as to make greater use of the possibilities offered by CBA.12 The revised framework reflects the importance of digital information, representations, devices and applications as realities that adults have to deal with in responding to the numerical demands of everyday life. To this end, the content dimension of the numeracy framework has been significantly updated to include representations of mathematical information in the form of ‘structured information’ (infographics, etc.) and also ‘dynamic applications’ (e.g. online interactive websites and applications alongside more standard software applications and tools). The dimension of cognitive processes has also been revised to emphasise the ability to recognise and identify how and when to use mathematics; the ability to understand, use and apply mathematical concepts and procedures; and the capacity to use strategic, reasoning and reflective skills when using and applying mathematics.
In PIAAC Cycle 2, the assessment of numeracy will be accompanied by an assessment of ‘numeracy components’. As for literacy, the numeracy components assessment focuses on some of the skills essential for achieving automaticity and fluency in managing mathematical and numerical information. The focus is on ‘number sense’ defined as ‘the sense of quantities and the sense of how numbers represent quantities’ (see numeracy framework). The items to be used will be of two types: items relating to quantities (using the stem ‘How many…?’) and items relating to relative magnitudes (‘The biggest?’).
Developments in problem solving
Problem solving represents the domain in which the changes in the conceptualisation of the skill in question have been greatest.13 This is one of the reasons why the assessments of problem solving have not been linked across assessments. An assessment of problem solving was first undertaken in ALL, based on the construct of ‘analytical problem solving’ (Murray, Clermont and Binkley, 2005[10]) and assessed in paper-based format. This was replaced with the assessment of ‘problem solving in technology-rich environments’ (PS-TRE) in PIAAC Cycle 1 which has been replaced, in its turn, by adaptive problem solving (APS) in PIAAC Cycle 2.
Analytical problem solving in ALL focused on the generic aspects of the process of problem solving understood as ‘goal-directed thinking and action in situations for which no routine solution procedure is available’ (Murray, Clermont and Binkley, 2005, p. 197[10]), in particular the steps of:
identifying a problem
searching for relevant information and integrating it into a coherent problem representation
evaluating the problem situation with respect to given goals and criteria
devising a plan for the solution – i.e. an ordered sequence of appropriate actions
monitoring its execution.
The assessment of problem solving in ALL was a paper-based assessment involving static problems in which all necessary information was provided up front. The limitations of this approach were explicitly acknowledged. In particular, computer simulated tasks were seen as the only way to address the dynamic aspects of task regulation (continuous processing of incoming information, coping with processes that cannot be influenced directly, coping with feedback and critical incidents).
In Cycle 1 of PIAAC, the assessment of problem solving moved to CBA mode in the form of the assessment of PS-TRE. PS-TRE represented a hybrid construct, at the intersection of the capacity to use information and communication technologies (ICTs) on the one hand, and of the ability to solve problems on the other. This was reflected in the restriction of the domain of problems covered to that of ‘information problems’ – problems that involved interaction with digital devices and applications (PIAAC Expert Group in Problem Solving in Technology-Rich Environments, 2009, pp. 8-9[12]):
The problem is primarily a consequence of the availability of new technologies.
The solution to the problem requires the use of computer-based artefacts (applications, representational formats, computational procedures).
The problems are related to technology-rich environments themselves (e.g. how to operate a computer, how to fix a settings problem, how to use an Internet browser).
The focus on the assessment of problems in digital environments constituted both the strength and the weakness of PS-TRE. By design, only test-takers who had some (basic) ICT skills could display proficiency in this domain. Non-response for reasons of lack of familiarity with ICT devices or poor computer skills was construct-relevant and could be interpreted as lack of proficiency. The downside was that a sizeable proportion (between 8% and 57%) of respondents in all participating countries did not take the assessment at all as they either lacked familiarity with computers or did not wish to undertake PIAAC on a laptop14 (OECD, 2019[15]). This created difficulties in comparisons of results between participating countries15 and meant that there was a considerable gap in the knowledge regarding the problem solving skills per se of the adult population.
APS, as conceptualised for PIAAC Cycle 2, represents the return to a concept of general problem solving that is relevant to a range of information environments and contexts and is not limited to digitally embedded problems even though digital aspects as a mode of problem solving play an important role in APS. What differentiates it from analytical problem solving as assessed in ALL is its focus on the dynamic and adaptive aspects of problem solving – the capacity to react to unforeseen changes and new information that emerge during the process – and on metacognition – the capacity to reflect on the process of problem solving as it takes place (monitoring progress, adjusting goals and strategies in the light of new information and changes in the problem situation).
Relationship of the PIAAC and PISA assessments
In addition to PIAAC, the OECD manages the Programme for International Student Assessment (PISA), an assessment of 15-year-old school students that has been administered every three years since 2000. In each assessment cycle, PISA assesses skills in three core domains (reading literacy, mathematical literacy and scientific literacy) as well as an additional domain unique to each cycle. Assessments of problem solving were administered as the additional domain in 2003, 2012 and 2015.
While similar skills are assessed in PIAAC and PISA in the domains of literacy/reading literacy, numeracy/mathematical literacy and problem solving, the two studies have followed separate development paths and have not been designed to be linked psychometrically. The measurement scales in related domains (e.g. literacy/reading literacy) are independent and the assessments have no items in common.16 This reflects a degree of path dependency (PIAAC is designed to be linked to IALS and ALL) as well as the fact that the two assessments have different target populations.
At the same time, PIAAC and PISA share much at a conceptual level. They belong to the same measurement tradition, share a similar approach to the conceptualisation and definition of the constructs that they measure and a similar assessment methodology. In addition, there have been many experts who have worked on both studies. Reviewing the relationship between the assessment of numeracy in PIAAC and the assessment of mathematical literacy in PISA, Gal and Tout (2014, p. 52[16]) conclude that:
Both assessments of numeracy in PIAAC and mathematical literacy in PISA appear to have substantial conceptual similarities and quite a few practical commonalities in the nature of their test items and their design principles, as well as the range of content areas and skills they cover. The two surveys are highly consistent in their descriptions and structures for contexts and real world content classifications, along with how they describe the types and breadth of responses and actions expected of the respondents.
Much the same comments could be made regarding the literacy/reading literacy and problem solving frameworks in both studies (see OECD (2019[17]), pp. 91-93).
Over time, there has been considerable mutual influence between adult and student assessments, particularly regarding the conceptualisation and definition of skills in reading and managing mathematical and numeric information. The IALS literacy frameworks strongly influenced the development of the first PISA reading framework (OECD, 1999[18]) at the end of the 1990s. The adoption in PISA of an approach to the assessment of reading, mathematics and science that focused on the use of these skills in settings outside school owes much to the IALS approach to the assessment of literacy, with its emphasis on the role of reading for social functioning. The PISA 2000 reading framework adopted the classification of text types developed in IALS, particularly the prose/document distinction. In many ways, PISA could be viewed as an IALS for school students.17 The PISA frameworks have in their turn influenced PIAAC, particularly in the domain of reading/literacy. The single reading scale adopted by PISA prefigured the single PIAAC literacy scale, for example, and the classifications of texts and cognitive processes adopted in PIAAC Cycle 1 reflect those used in PISA.
Reflecting the conceptual links between the two studies, one of the considerations in the development of the assessment frameworks for PIAAC Cycle 2 was to maximise the conceptual and terminological consistency between the PIAAC and PISA frameworks where relevant and appropriate. At the same time, the frameworks continue to reflect the fact that PIAAC is an assessment of adults.
The framework documents
The framework documents included in this volume were each prepared by a dedicated expert group18 over the period 2018-19, with the process being managed and coordinated by the PIAAC international contractor led by Educational Testing Service (ETS). Members were selected to include experts from different backgrounds and countries. In all groups, some members had also served on the groups responsible for the Cycle 1 frameworks, thus ensuring continuity between the cycles, and others had also worked on the PISA project in various capacities. While each expert group worked independently, there was close communication between the groups, particularly between the Chairs. In addition, there was overlap in membership, with the Chair of the reading group also serving as a member of the problem solving group.
In both adaptive problem solving and numeracy, the work of the expert groups built on earlier exploratory work commissioned by the PIAAC Board of Participating Countries (BPC), the steering committee for the PIAAC project. An initial conceptual framework for the assessment of adaptive problem solving was prepared in 2017 (Greiff et al., 2017[5]) as was a review of the PIAAC numeracy framework (Tout et al., 2017[6]).
The framework documents represent a work in progress. They will be updated following the completion of the main study data collection. At this point, the expert groups will review and revise the descriptors for the proficiency levels used to describe the measurement scales in the case of literacy and numeracy and develop the described scale in the case of APS.
References
[16] Gal, I. and D. Tout (2014), “Comparison of PIAAC and PISA Frameworks for Numeracy and Mathematical Literacy”, OECD Education Working Papers, No. 102, OECD Publishing, Paris, https://dx.doi.org/10.1787/5jz3wl63cs6f-en.
[5] Greiff, S. et al. (2017), “Adaptive problem solving: Moving towards a new assessment domain in the second cycle of PIAAC”, OECD Education Working Papers, No. 156, OECD Publishing, Paris, https://dx.doi.org/10.1787/90fde2f4-en.
[19] Keslair, F. (2018), “Interviewers, test-taking conditions and the quality of the PIAAC assessment”, OECD Education Working Papers, No. 191, OECD Publishing, Paris, https://dx.doi.org/10.1787/5babb087-en.
[1] Krenzke, T. et al. (2019), U.S. Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014/2017: Main Study, National Supplement, and PIAAC 2017 Technical Report (NCES 2020042), U.S. Department of Education, National Center for Education Statistics, Washington, DC, https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2020224.
[29] Maehler, D., S. Jakowatz and I. Konradt (2020), PIAAC Bibliography - 2008-2019, (GESIS Papers, 2020/04), GESIS - Leibniz-Institut für Sozialwissenschaften, Köln, https://doi.org/10.21241/ssoar.67732.
[28] Mullis, I. and M. Martin (eds.) (2015), PIRLS 2016 Assessment Framework, 2nd Edition, TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College and International Association for the Evaluation of Educational Achievement (IEA), https://timssandpirls.bc.edu/pirls2016/framework.html.
[27] Mullis, I. and M. Martin (eds.) (2013), TIMSS 2015 Assessment Frameworks, TIMSS and PIRLS International Study Center, Lynch School of Education, Boston College and International Association for the Evaluation of Educational Achievement (IEA), https://timssandpirls.bc.edu/timss2015/frameworks.html.
[10] Murray, S., Y. Clermont and M. Binkley (eds.) (2005), Measuring Adult Literacy and Life Skills: New Frameworks for Assessment, Catalogue No. 89-552-MIE, No. 13. Statistics Canada, Ottawa.
[8] Murray, S., I. Kirsch and L. Jenkins (eds.) (1998), Adult Literacy in OECD Countries: Technical Report on the First International Adult Literacy Survey, (NCES 98-053), National Center for Education Statistics, Office of Educational Research and Improvement, Washington, DC.
[24] OECD (2019), Beyond Proficiency: Using Log Files to Understand Respondent Behaviour in the Survey of Adult Skills, OECD Skills Studies, OECD Publishing, Paris, https://dx.doi.org/10.1787/0b1414ed-en.
[23] OECD (2019), PISA 2018 Assessment and Analytical Framework, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/b25efab8-en.
[15] OECD (2019), Skills Matter: Additional Results from the Survey of Adult Skills, OECD Skills Studies, OECD Publishing, Paris, https://dx.doi.org/10.1787/1f029d8f-en.
[3] OECD (2019), Technical Report of the Survey of Adult Skills, Third Edition, http://www.oecd.org/skills/piaac/publications/PIAAC_Technical_Report_2019.pdf.
[17] OECD (2019), The Survey of Adult Skills: Reader’s Companion, Third Edition, OECD Skills Studies, OECD Publishing, Paris, https://dx.doi.org/10.1787/f70238c7-en.
[22] OECD (2016), Skills Matter: Further Results from the Survey of Adult Skills, OECD Skills Studies, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264258051-en.
[21] OECD (2013), OECD Skills Outlook 2013: First Results from the Survey of Adult Skills, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264204256-en.
[11] OECD (2012), Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264128859-en.
[20] OECD (2002), Reading for Change: Performance and Engagement across Countries: Results from PISA 2000, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264099289-en.
[18] OECD (1999), Measuring Student Knowledge and Skills: A New Framework for Assessment, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264173125-en.
[26] OECD/Statistics Canada (2011), Literacy for Life: Further Results from the Adult Literacy and Life Skills Survey, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264091269-en.
[25] OECD/Statistics Canada (2005), Learning a Living: First Results of the Adult Literacy and Life Skills Survey, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264010390-en.
[9] OECD/Statistics Canada (2000), Literacy in the Information Age: Final Report of the International Adult Literacy Survey, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264181762-en.
[2] PIAAC (2014), PIAAC Technical Standards and Guidelines - June 2014, OECD Publishing, http://www.oecd.org/skills/piaac/PIAAC-NPM(2014_06)PIAAC_Technical_Standards_and_Guidelines.pdf.
[12] PIAAC Expert Group in Problem Solving in Technology-Rich Environments (2009), “PIAAC Problem Solving in Technology-Rich Environments: A Conceptual Framework”, OECD Education Working Papers, No. 36, OECD Publishing, Paris, https://dx.doi.org/10.1787/220262483674.
[13] PIAAC Literacy Expert Group (2009), “PIAAC Literacy: A Conceptual Framework”, OECD Education Working Papers, No. 34, OECD Publishing, Paris, https://dx.doi.org/10.1787/220348414075.
[14] PIAAC Numeracy Expert Group (2009), “PIAAC Numeracy: A Conceptual Framework”, OECD Education Working Papers, No. 35, OECD Publishing, Paris, https://dx.doi.org/10.1787/220337421165.
[4] Sabatini, J. and K. Bruce (2009), “PIAAC Reading Component: A Conceptual Framework”, OECD Education Working Papers, No. 33, OECD Publishing, Paris, https://dx.doi.org/10.1787/220367414132.
[7] Tout, D. (2020), “Evolution of adult numeracy from quantitative literacy to numeracy: Lessons learned from international assessments”, International Review of Education, Vol. 66/2-3, pp. 183-209, http://dx.doi.org/10.1007/s11159-020-09831-4.
[6] Tout, D. et al. (2017), Review of the PIAAC Numeracy Assessment Framework: Final Report, Australian Council for Educational Research, Camberwell, Australia.
[30] Wallin, G. (2018), “New PIAAC study coming up – to measure abilities among adults”, Nordic Labour Journal, http://www.nordiclabourjournal.org/nyheter/news-2018/article.2018-12-14.7343538187.
Annex 1.A. Summary of the evolution of assessment frameworks – from IALS to PIAAC Cycle 2
Annex Table 1.A.1. Literacy (Reading)
| | IALS/ALL (Prose) | IALS/ALL (Document) | PIAAC Cycle 1 | PIAAC Cycle 2 |
|---|---|---|---|---|
| Construct | Prose Literacy | Document Literacy | Literacy | Literacy |
| Definition | Literacy is using printed and written information to function in society, to achieve one’s goals, and to develop one’s knowledge and potential. Prose literacy is the knowledge and skills needed to understand and use information from texts, including editorials, news stories, brochures and instruction manuals. | Literacy is using printed and written information to function in society, to achieve one’s goals, and to develop one’s knowledge and potential. Document literacy is the knowledge and skills required to locate and use information contained in various formats, including job applications, payroll forms, transportation schedules, maps, tables and charts. | Literacy is the ability to understand, evaluate, use and engage with written texts to participate in society, to achieve one’s goals, and to develop one’s knowledge and potential. Literacy encompasses a range of skills from the decoding of written words and sentences to the comprehension, interpretation and evaluation of complex texts. | Literacy is accessing, understanding, evaluating and reflecting on written texts in order to achieve one’s goals, to develop one’s knowledge and potential and to participate in society. |
| Cognitive processes | | | | |
| Content | Continuous texts | Non-continuous texts | Texts characterised by their medium (print-based or digital) and by format | Texts characterised by their: |
| Contexts | | | | |
| Factors affecting task difficulty | | | | |
| Assessment mode | Paper-based | Paper-based | Computer-based (laptop device) + paper-based option | Computer-based (tablet device) + paper-based option in a limited number of countries |
Sources: For IALS: Murray, Kirsch and Jenkins (1998[8]). For ALL: Murray, Clermont and Binkley (2005[10]). For PIAAC Cycle 1: OECD (2019[17]). For PIAAC Cycle 2: the frameworks included in this volume.
Annex Table 1.A.2. Managing numerical and mathematical information
| | IALS | ALL | PIAAC Cycle 1 | PIAAC Cycle 2 |
|---|---|---|---|---|
| Construct | Quantitative Literacy | Numeracy | Numeracy | Numeracy |
| Definition | Quantitative literacy is the knowledge and skills required to apply arithmetic operations, either alone or sequentially, to numbers embedded in printed materials, such as balancing a chequebook, figuring out a tip, completing an order form or determining the amount of interest on a loan from an advertisement. | Numeracy is the knowledge and skills required to effectively manage and respond to the mathematical demands of diverse situations. Numerate behaviour is observed when people manage a situation or solve a problem in a real context; it involves responding to information about mathematical ideas that may be represented in a range of ways; it requires the activation of a range of enabling knowledge, factors and processes. | Numeracy is the ability to access, use, interpret and communicate mathematical information and ideas, in order to engage in and manage the mathematical demands of a range of situations in adult life. To this end, numeracy involves managing a situation or solving a problem in a real context, by responding to mathematical content/information/ideas represented in multiple ways. | Numeracy is accessing, using and reasoning critically with mathematical content, information and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life. |
| Content | Non-continuous texts | Mathematical information; representations of mathematical information | Mathematical content, information and ideas; representations of mathematical content | Mathematical content, information and ideas; mathematical representations |
| Cognitive processes | | | | |
| Contexts | | | | |
| Factors affecting task difficulty | | | | |
| Assessment mode | Paper-based | Paper-based | Computer-based (laptop device) + paper-based option | Computer-based (tablet device) + paper-based option in a limited number of countries |
Sources: For IALS: Murray, Kirsch and Jenkins (1998[8]). For ALL: Murray, Clermont and Binkley (2005[10]). For PIAAC Cycle 1: OECD (2019[17]). For PIAAC Cycle 2: the frameworks included in this volume.
Annex Table 1.A.3. Problem solving
| | ALL | PIAAC Cycle 1 | PIAAC Cycle 2 |
|---|---|---|---|
| Construct | Analytical Problem Solving | Problem Solving in Technology-Rich Environments | Adaptive Problem Solving |
| Definition | Problem solving involves goal-directed thinking and action in situations for which no routine solution procedure is available. The problem solver has a more or less well-defined goal, but does not immediately know how to reach it. The incongruence of goals and admissible operators constitutes a problem. The understanding of the problem situation and its step-by-step transformation, based on planning and reasoning, constitute the process of problem solving. | Problem solving in technology-rich environments involves the ability to use digital technology, communication tools and networks to acquire and evaluate information, communicate with others and perform practical tasks. The assessment focuses on the abilities to solve problems by setting up appropriate goals and plans, and accessing and making use of information through computers and computer networks. | Adaptive problem solving involves the capacity to achieve one’s goals in a dynamic situation, in which a method for solution is not immediately available. It requires engaging in cognitive and metacognitive processes to define the problem, search for information, and apply a solution in a variety of information environments and contexts. |
| Cognitive processes | | | |
| Content | Problems | Technology; nature of problems | Aspects of problems |
| Contexts | Not specified | | |
| Factors affecting task difficulty | Not specified | | |
| Assessment mode | Paper-based | Computer-based (laptop device) | Computer-based (tablet device) |
Sources: For ALL: Murray, Clermont and Binkley (2005[10]). For PIAAC Cycle 1: OECD (2019[17]). For PIAAC Cycle 2: the frameworks included in this volume.
Notes
← 1. Results from IALS can be found in OECD/Statistics Canada (2000[9]) and results from ALL in OECD/Statistics Canada (2005[25]; 2011[26]).
← 2. Results have been published in OECD (2013[21]; 2016[22]; 2019[15]). A comprehensive bibliography of publications based on PIAAC over the period 2008 to 2019 is provided in Maehler, Jakowatz and Konradt (2020[29]).
← 3. The PIAAC Technical Standards and Guidelines (PIAAC, 2014[2], Guideline 10.4.1) provide that the interview should be completed in the respondent’s home. However, if the respondent prefers, it may be conducted at a neutral location such as a library, community centre or office. On average, across all countries, around 91% of interviews took place in the respondent’s home (see Keslair, 2018, pp. 11-13[19]). In a small number of countries, around a third of interviews took place in a location other than the respondent’s residence.
← 4. The background questionnaire used in Cycle 1 of PIAAC can be accessed at: http://www.oecd.org/skills/piaac/BQ_MASTER.HTM. The background questionnaire for Cycle 2 will be largely similar, although it will be improved and updated in a number of dimensions.
← 5. See, for example, the frameworks for PISA (OECD, 2019[23]), TIMSS (Mullis and Martin, 2013[27]) and PIRLS (Mullis and Martin, 2015[28]).
← 6. The assessment is usually delivered in the national language or languages only. In a small number of participating countries, the assessment is also made available in widely spoken minority languages [see Table 4.11 in OECD (2019[17])].
← 7. Tout (2020[7]) offers a comprehensive overview of the changes in the conceptualisation of ‘numeracy’ between IALS and PIAAC Cycle 2. A good discussion of the factors that influence the evolution of assessment frameworks in reading in PISA, which is also relevant to PIAAC, can be found in OECD (2019, pp. 22-27[23]).
← 8. See OECD (2019[24]) for an exploration of the log-file data derived from PIAAC.
← 9. One aspect of the assessment of literacy that has remained constant across adult assessments since IALS is that it has been undertaken as an assessment of reading (of the understanding of and engagement with written texts) and has not included the dimension of writing or the production of text. This represents a pragmatic choice rather than a theoretical position. It is acknowledged that writing represents an important dimension of a broad concept of literacy. However, the challenges of directly assessing writing proficiency are sufficiently large to make it impractical in large-scale cross-national assessments such as PIAAC.
← 10. As well as text formats common in digital environments (e.g. multiple texts or texts constituted by a series of juxtaposed texts).
← 11. Performance in the reading components assessment was reported separately from performance in literacy in PIAAC Cycle 1.
← 12. In the words of the numeracy framework, Cycle 1 numeracy test items were ‘based predominantly around static images and associated responses’ and were ‘more like paper-based assessments transferred onto a computer’.
← 13. This is also true of PISA, where three separate constructs have been assessed: analytical problem solving (2003), creative problem solving (2012) and collaborative problem solving (2015).
← 14. Paper-based versions of the assessments of literacy and numeracy were available for respondents.
← 15. As a variable proportion of the 16-65 year-old population took the assessment on computer, comparison of mean scores between countries was not possible. Presentation of country differences focuses on the proportion of the population performing at different proficiency levels.
← 16. The exception is the reading assessment in PISA 2000, in which fifteen prose literacy items from IALS were included. The intention was to see whether the results of the two studies could be reported on a common scale. Chapter 8 of OECD (2002[20]) discusses the findings of an analysis of the performance of students on the IALS items.
← 17. The description of PIAAC as a ‘PISA for adults’ [see, for example, Wallin (2018[30])] ignores the fact that adult assessments (in the form of IALS) predated PISA and fails to acknowledge the strong influence of IALS on PISA. It is important to note that PISA also owes a considerable debt to the International Association for the Evaluation of Educational Achievement (IEA) studies TIMSS and PIRLS, which demonstrated the feasibility and utility of large-scale international assessments of school students.
← 18. The members of the expert groups are listed in the acknowledgements.