According to cognitive load theory, instructions can impose three types of cognitive load on the learner: intrinsic load, extraneous load, and germane load. Proper measurement of the different types of cognitive load can help us understand why the effectiveness and efficiency of learning environments may differ as a function of instructional formats and learner characteristics. In this article, we present a ten-item instrument for the measurement of the three types of cognitive load. Principal component analysis on data from a lecture in statistics for PhD students (n = 56) in psychology and health sciences revealed a three-component solution, consistent with the types of load that the different items were intended to measure. This solution was confirmed by a confirmatory factor analysis of data from three lectures in statistics for different cohorts of bachelor students in the social and health sciences (ns = 171, 136, and 148), and received further support from a randomized experiment with university freshmen in the health sciences (n = 58).
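The component-extraction step described above can be illustrated with a minimal sketch. All numbers below are invented for illustration: we simulate questionnaire responses in which three items load on each of two latent load types and four on a third, run a principal component analysis on the item correlation matrix, and retain components by the Kaiser criterion (eigenvalue > 1). This is not the authors' analysis or data, only a sketch of the technique.

```python
import numpy as np

# Hypothetical data: 200 respondents, ten questionnaire items.
# Items 1-3 load on one latent factor, 4-6 on a second, 7-10 on a
# third (loadings and noise level are invented for this sketch).
rng = np.random.default_rng(0)
n = 200
f1 = rng.normal(size=(n, 1))
f2 = rng.normal(size=(n, 1))
f3 = rng.normal(size=(n, 1))
items = np.hstack([
    f1 @ np.ones((1, 3)),
    f2 @ np.ones((1, 3)),
    f3 @ np.ones((1, 4)),
]) + 0.3 * rng.normal(size=(n, 10))

# Principal component analysis via eigendecomposition of the
# item correlation matrix; eigenvalues sorted in descending order.
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]

# Kaiser criterion: retain components with eigenvalue > 1.
n_components = int(np.sum(eigvals > 1.0))
print(n_components)  # 3 components, one per simulated load type
```

With clearly separated factors, the eigenvalue spectrum shows three values well above 1 and the rest near the noise floor, which is the kind of pattern a three-component solution reflects.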
In two studies, we investigated whether a recently developed psychometric instrument can differentiate intrinsic, extraneous, and germane cognitive load. Study I revealed a similar three-factor solution for language learning (n = 108) and a statistics lecture (n = 174), and statistics exam scores correlated negatively with the factors assumed to represent intrinsic and extraneous cognitive load during the lecture. In Study II, university freshmen who studied applications of Bayes' theorem in an example-example (n = 18) or example-problem (n = 18) condition demonstrated better posttest performance than their peers who studied the applications in a problem-example (n = 18) or problem-problem (n = 20) condition, and a slightly modified version of the aforementioned psychometric instrument could help researchers to differentiate intrinsic and extraneous cognitive load. The findings provide support for a recent reconceptualization of germane cognitive load as referring to the actual working memory resources devoted to dealing with intrinsic cognitive load.
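For readers unfamiliar with the study material mentioned above, the following is a minimal sketch of the kind of Bayes' theorem application such learners typically work through: a diagnostic-test problem. The scenario and all probabilities are invented for illustration and are not taken from the study.

```python
# Hypothetical diagnostic-test problem (all numbers invented):
# a rare condition with a fairly accurate test.
p_disease = 0.01            # prior P(D)
p_pos_given_disease = 0.9   # sensitivity, P(+ | D)
p_pos_given_healthy = 0.05  # false-positive rate, P(+ | not D)

# Total probability of a positive result:
# P(+) = P(+ | D) P(D) + P(+ | not D) P(not D)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # → 0.154
```

The counterintuitive result, that a positive test still leaves the probability of disease below one in six, is exactly the kind of outcome that makes worked examples of Bayes' theorem instructive for novices.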
Application of physiological methods, in particular electroencephalography (EEG), offers new and promising approaches to educational psychology research. EEG is identified as a physiological index that can serve as an online, continuous measure of cognitive load detecting subtle fluctuations in instantaneous load, which can help explain effects of instructional interventions when measures of overall cognitive load fail to reflect such differences in cognitive processing. This paper presents a review of seminal literature on the use of continuous EEG to measure cognitive load and describes two case studies on learning from hypertext and multimedia that employed EEG methodology to collect and analyze cognitive load data.
Keywords: Electroencephalography, Cognitive load, Educational psychology

Researchers working in the context of cognitive load theory (CLT; Paas et al. 2003a, b; Sweller 1988; Sweller et al. 1998) have been concerned with analyzing the effects of cognitive load on learning and devising strategies and tools to help learners maintain an optimal level of load in various learning contexts. As a consequence, measurement of cognitive load plays a key role in CLT research (Paas et al. 2003a, b). In this article, we will discuss new possibilities for cognitive load measurement offered by neuroscience, focusing in particular on electroencephalography (EEG).
For self-regulated learning to be effective, students need to be able to accurately assess their own performance on a learning task and use this assessment for the selection of a new learning task. Evidence suggests, however, that students have difficulties with accurate self-assessment and task selection, which may explain the poor learning outcomes often found with self-regulated learning. In Experiment 1, the hypothesis was investigated and confirmed that observing a human model engaging in self-assessment, task selection, or both could be effective for secondary education students' (N = 80) acquisition of self-assessment and task-selection skills. Experiment 2 investigated and confirmed the hypothesis that secondary education students' (N = 90) acquisition of self-assessment and task-selection skills, either through examples or through practice, would enhance the effectiveness of self-regulated learning. It can be concluded that self-assessment and task-selection skills indeed play an important role in self-regulated learning and that training these skills can significantly increase the amount of knowledge students can gain from self-regulated learning in which they choose their own learning tasks.
This study investigated the amounts of problem-solving process information ("action," "why," "how," and "metacognitive") elicited by means of concurrent, retrospective, and cued retrospective reporting. In a within-participants design, 26 participants completed electrical circuit troubleshooting tasks under different reporting conditions. The method of cued retrospective reporting used the original computer-based task and a superimposed record of the participant's eye fixations and mouse-keyboard operations as a cue for retrospection. Cued retrospective reporting (with the exception of why information) and concurrent reporting (with the exception of metacognitive information) resulted in a higher number of codes on the different types of information than did retrospective reporting.
Kirschner, Sweller, and Clark (2006) suggest that unguided or minimally guided instructional approaches are less effective and efficient for novices than guided instructional approaches because they ignore the structures that constitute human cognitive architecture. While we concur with the authors on this point, we do not agree with their equating of problem-based learning with minimally guided instruction. In this commentary, we argue that problem-based learning is an instructional approach that allows for flexible adaptation of guidance and that, contrary to Kirschner et al.'s conclusions, its underlying principles are well compatible with the manner in which our cognitive structures are organized.

In a recent article, Kirschner, Sweller, and Clark (2006) assert that unguided or minimally guided instructional approaches are less effective and efficient than guided instructional approaches because they ignore the structures that constitute human cognitive architecture. While we concur with the authors about the failure of minimally guided instruction for novices learning in structured domains, in this commentary we will argue that problem-based learning (PBL) is an instructional approach that cannot be equated with minimally guided instruction. On the contrary, we contend that the elements of PBL allow for flexible adaptation of guidance, making this instructional approach potentially more compatible with the manner in which our cognitive structures are organized than the direct guided instructional approach advocated by Kirschner et al. (2006).