Measures of student learning
Measures of student learning can be either direct or indirect. Details and examples of both types of measures are given below.
Direct measures of learning over time
Concept tests, such as the Force Concept Inventory (Hestenes and Halloun, 1995) – available from Mazur (1997) – are widely used in engineering and physics schools across the world to evaluate students’ conceptual understanding. Sample concept tests related to a wide range of science, engineering and mathematics topics are available from:
- The Assessment Instruments Information Page, hosted by Professor Robert Beichner at North Carolina State University
Direct measures of learning at a single point in time
- student performance in institutional examinations and assignments can be used, in particular, to demonstrate the positive impact of pedagogical or curricular change as part of a promotion case;
- products/outputs produced by students during a course or programme, such as final-year projects, concept maps or oral examinations;
- student performance in standardised tests, capturing either generic learning outcomes through tools such as the Collegiate Learning Assessment (Klein et al., 2007) or discipline-specific capabilities through tools such as AHELO (OECD, 2009). Although such tools are primarily designed for comparisons between institutions and countries, the data could also be disaggregated by programme to support a candidate’s case for promotion.
A wide range of evidence sources was used to demonstrate Dr Forest’s institutional impact and influence in teaching and learning, including:
- Professional activities: the educational portion of the promotion case centred on a description of three activities: (i) the co-founding of the ‘InVenture Prize’, a university invention competition, (ii) the establishment of the ‘Invention Studio’, an open-access space for student creativity, innovation and design, and (iii) the redesign of an engineering capstone design course.
- Peer assessments: including national press coverage of the educational activities developed by Dr Forest, a peer-reviewed pedagogical publication and details of the funds raised for the establishment of the ‘Invention Studio’.
- Indirect measures of student learning: including estimates of the number of companies founded by students engaged in the entrepreneurial and innovation activities established by Dr Forest.
- Direct measures of student learning: including an evaluation of the quality of student projects from the multi-disciplinary final-year design course established by Dr Forest, as described below.
Indirect measures typically relate either to institutional measures of student progression (e.g. pass rates, attrition rates) or to the perspectives of students and other stakeholders (e.g. unsolicited student feedback, student evaluation scores, employer feedback). Examples of indirect measures of student learning are listed below. Where possible, links to relevant measurement tools are provided.
- Student Evaluation of Educational Quality (SEEQ) captures student evaluations of 35 aspects of effective teaching in relation to their course or teacher. A version of the SEEQ questionnaire is reproduced in the appendices of Nash (2012).
- Student Assessment of Learning Gains (SALG) is a survey tool which, according to its authors (Seymour et al., 2000), “avoids critiques of the teacher, the teacher’s performance, and of teaching methods that are unrelated to student estimates of what they have gained from them”, focusing instead on “the learning gains that students perceive they have made” in terms of the learning outcomes of the course or activity.
- Targeted self-efficacy questionnaires are also available, often focusing on specific skills and attitudes, such as entrepreneurship (Lucas, 2014), or on specific disciplines, such as engineering design (Carberry et al., 2010).
- student attrition/retention rates
- student satisfaction in relation to specific courses, collected via survey and written feedback
- pass rates and degree classifications
- employer assessment of graduate capabilities, collected via survey
- post-graduation employment rates and salary scales
- graduate feedback about their educational experience, collected via survey
- peer-reviewed evidence (such as institutional and national teaching awards, peer-reviewed pedagogical articles and the inclusion of his teaching activities in published case studies of good practice) as indicators of scholarly teaching and pedagogical influence beyond his institution.
- a major curricular innovation, with associated improvements in student progression following its implementation, as an indirect measure of student learning; details are given below.
Using both survey and focus-group data, he conducted (i) an analysis of the design and delivery of Engineering Teams, identifying a number of constraints to the scheme that were subsequently improved upon during the years that followed, and (ii) a review of the impact of Engineering Teams on the student cohort. A major indicator of the impact of Engineering Teams, as highlighted in the promotion case, was the significant improvement in student progression rates following its introduction: from 83% to 93%. As Dr Joyce noted, “Going from a situation where we were ‘losing’ almost 1 in 5 students to one in which we were only ‘losing’ 1 in 11 conveyed a very strong message I thought, particularly when there was no additional financial expenditure by the School. These numbers were also backed up by positive student feedback which we gathered over the first year and at the beginning of second year”.
- Assessments by industry partners and/or graduate employers, such as (i) surveys capturing the perceived capabilities of graduates from particular programmes/universities compared to peer institutions or previous generations of graduates, or (ii) qualitative assessments of student performance on industry-linked curricular experiences or placements.
- Student engagement data, such as that captured through the US National Survey of Student Engagement.