48988 | Creator | e6254423dd69e7a2a2c65ecddd9d3092 |
48988 | Creator | 988e42a4ad32f24a3f8311a89515bcfc |
48988 | Creator | c9d52ef4a11ad1277436a89c20d0a5d4 |
48988 | Creator | 71122b7715e55a3f53d5c14540adcbbb |
48988 | Creator | f037b91553d4d0d207f3e2fe9751bb02 |
48988 | Date | 2017-11 |
48988 | Is Part Of | p07475632 |
48988 | Is Part Of | repository |
48988 | abstract | Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students' engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students' time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design in how students learn online. |
48988 | authorList | authors |
48988 | status | peerReviewed |
48988 | uri | http://data.open.ac.uk/oro/document/589458 |
48988 | uri | http://data.open.ac.uk/oro/document/589459 |
48988 | uri | http://data.open.ac.uk/oro/document/591885 |
48988 | uri | http://data.open.ac.uk/oro/document/591890 |
48988 | uri | http://data.open.ac.uk/oro/document/591891 |
48988 | uri | http://data.open.ac.uk/oro/document/591892 |
48988 | uri | http://data.open.ac.uk/oro/document/591893 |
48988 | uri | http://data.open.ac.uk/oro/document/591894 |
48988 | uri | http://data.open.ac.uk/oro/document/592719 |
48988 | uri | http://data.open.ac.uk/oro/document/652292 |
48988 | uri | http://data.open.ac.uk/oro/document/652299 |
48988 | uri | http://data.open.ac.uk/oro/document/652300 |
48988 | uri | http://data.open.ac.uk/oro/document/652301 |
48988 | uri | http://data.open.ac.uk/oro/document/652302 |
48988 | uri | http://data.open.ac.uk/oro/document/652303 |
48988 | uri | http://data.open.ac.uk/oro/document/661906 |
48988 | volume | 76 |
48988 | type | AcademicArticle |
48988 | type | Article |
48988 | label | Nguyen, Quan; Rienties, Bart; Toetenel, Lisette; Ferguson, Rebecca and Whitelock, Denise (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior, 76, pp. 703–714. |
48988 | Title | Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates |
48988 | in dataset | oro |