STLHE 2015
Achieving Harmony: Tuning into Practice
Wednesday, June 17 • 4:00pm - 5:30pm
POSTER.32 - Using easily accessible aggregate LMS and SEoT data to evaluate learning design, learner engagement and perceived course value


Learning analytics research may allow higher education to make better use of ‘big educational data’ for learning design (Ferguson, 2012; Lockyer et al., 2013). In this exploratory study we brought together several sets of data from our institution to examine whether aggregated course-level data can be used to assess the relationships between different elements of course engagement: course and assessment structure, student online activity, and perceived course value. We explored aggregate course-level data (Learning Management System data and student evaluation of teaching (SEoT) data (Marsh, 2007)) from 26 online courses, rather than individual learner data, with the goal of discovering approaches that may be generalizable across higher education institutions while avoiding the use of sensitive personal information. Our preliminary results indicate that online courses in which students spend more time on peer-interaction activities (mainly the discussion forum) receive higher evaluation scores, while the relationship between time spent on course content pages and perceived value is less clear. An emphasis on effort-based assessments and the organization of course materials into modules, on the other hand, are associated with higher perceived value. This work demonstrates the value of pooled, easily accessible, and anonymous data for high-level inferences about learning in online courses. Specifically, our analysis suggests that courses whose activities and assessments demand more learner time are, in fact, associated with increased perceived value, especially when students use their time in the course to interact with peers. Results also show that course structure can contribute to productive interactions, though not as straightforwardly as one might expect.

Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. doi:10.1177/000276421347936

Marsh, H. W. (2007). Students’ evaluations of university teaching: Dimensionality, reliability, validity, potential biases and usefulness. In R. P. Perry & J. C. Smart (Eds.), The scholarship of teaching and learning in higher education: An evidence-based perspective (pp. 319–383). Springer Netherlands.

Speakers

Ido Roll

Director, Institute for Scholarship of Teaching and Learning, University of British Columbia
Technology can help folks be eager learners, fruitful in context.


Wednesday June 17, 2015 4:00pm - 5:30pm
Bayshore Foyer
