How Should We Quantify Student Engagement?

Student engagement is widely thought to be a key predictor of student motivation and achievement (Kuh, Cruce, Shoup, Kinzie, & Gonyea, 2008; Pintrich & de Groot, 1990). We all have a sense of what an engaged student looks like (Ainley, 2012; Christenson, Reschly, & Wylie, 2012; Finn & Zimmer, 2012): they ask questions, they challenge assertions and they complete assignments. One challenge instructors face is identifying students at risk of failure early enough in the semester to provide additional support and mentorship, and these are often the students who are not engaged. As we become able to digitally measure more and more of how a student participates, to what degree will it be possible to identify a suite of measures that flags underperforming students earlier in the semester?

To start, what student actions should be included as potential measures of engagement? Kuh et al. (2008) define student engagement as follows:

“Student engagement represents both the time and energy students invest in educationally purposeful activities.”

Unfortunately, this definition does not specify which student actions to include in a measure of engagement. Here, a suite of measures collected in an introductory science course, Extreme Weather, taught at the University of Michigan in the winter 2014 semester is treated as a set of participation measures and compared with student outcomes.

The course studied, AOSS 102, Extreme Weather, is one of many courses students may take to fulfill a science distribution requirement. In the winter 2014 semester it was offered as a hybrid course streamed in real time, with the option to participate in person or remotely. Students used LectureTools, which allows them to participate synchronously from wherever they are: answering questions, posing questions, annotating slides, taking notes and indicating confusion.

LectureTools also provides the instructor with tools to ask a wide range of question types, including multiple choice, reorder-list, free response, numerical and image-based questions, the last of which is excellent for testing students' understanding of graphs, images and maps.

All LectureTools data are stored in the cloud and are available to the course instructor via a web portal that summarizes student participation on a class-by-class basis. For this study, the data collected during the semester were separated into three periods of participation: from the beginning of the semester to the first exam, from the first exam to the second, and from the second exam to the third.
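As a concrete illustration, the following is a minimal sketch of that period split in Python, assuming a hypothetical per-event export (lecturetools_events.csv) with student_id and timestamp columns; the exam dates shown are placeholders, not the actual winter 2014 schedule.

    import pandas as pd

    # Hypothetical per-event participation export; "timestamp" marks when
    # each answer, question, note or annotation was recorded.
    events = pd.read_csv("lecturetools_events.csv", parse_dates=["timestamp"])

    # Placeholder exam dates dividing the semester into three periods.
    exam_dates = pd.to_datetime(["2014-02-10", "2014-03-17", "2014-04-21"])
    start = events["timestamp"].min() - pd.Timedelta(seconds=1)

    # Label each event with the exam period it falls in.
    events["period"] = pd.cut(
        events["timestamp"],
        bins=[start, *exam_dates],
        labels=["to_exam1", "exam1_to_exam2", "exam2_to_exam3"],
    )

    # Per-student participation counts in each period.
    period_counts = (
        events.groupby(["student_id", "period"], observed=False)
        .size()
        .unstack(fill_value=0)
    )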

Because others have demonstrated that student outcomes are often well correlated with students' incoming Grade Point Average (GPA), additional data were obtained from the University of Michigan's Data Warehouse, which holds student demographic and academic achievement records. The data extracted included each student's gender, GPA and year in school, and these were then linked to students' hourly exam grades.
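The linkage step might look like the following sketch, assuming hypothetical exports keyed by an anonymized student_id; the file and column names are illustrative, not the actual Data Warehouse schema.

    import pandas as pd

    # Illustrative exports: demographics/achievement and per-exam grades.
    warehouse = pd.read_csv("data_warehouse.csv")  # student_id, gender, gpa, year
    exams = pd.read_csv("exam_grades.csv")         # student_id, exam1, exam2, exam3

    # Inner join keeps only students present in both sources.
    linked = warehouse.merge(exams, on="student_id", how="inner")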

This study was limited to one class in one semester and hence is not intended to represent multiple courses at multiple institutions. Nonetheless, it provides a unique suite of measurements that includes more parameters than have previously been reported in one set.

The results indicate the following (one way such relationships might be tested is sketched after the list):

  • Student outcomes were positively related to extrinsic influences, including incoming GPA, interest in the content, and physical and emotional state.
  • Student outcomes were not related to measures of class attendance alone.
  • Student outcomes were not related to the number of times students participated remotely.
  • Student outcomes were positively related to the level of participation in in-class questions.
  • Student outcomes were positively related to how many questions students got right in class.
  • Student outcomes were positively related to the number of slides for which students created notes.
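A plausible version of these correlation checks, assuming a hypothetical per-student summary table (per_student_measures.csv) whose column names are illustrative stand-ins for the measures above:

    import pandas as pd
    from scipy.stats import pearsonr

    df = pd.read_csv("per_student_measures.csv")

    measures = ["gpa", "attendance", "remote_sessions",
                "questions_answered", "pct_correct", "slides_with_notes"]

    # Correlate each participation measure with a summary outcome,
    # here an assumed total exam score column.
    for m in measures:
        r, p = pearsonr(df[m], df["exam_total"])
        print(f"{m:>20s}: r = {r:+.2f}, p = {p:.3f}")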

Because student outcomes and incoming GPAs were correlated, this study also examined what students with different GPAs tend to do in class that might account for their different levels of academic success. The findings show that students with higher GPAs differed from lower-GPA students in the following ways (a sketch of this comparison follows the list):

  • Higher GPA students tended to participate in somewhat more questions in class.
  • Higher GPA students tended to get a higher percentage of questions correct in class.
  • Higher GPA students took notes on significantly more slides during the course.
  • Higher GPA students were far more likely to physically come to class than lower GPA students.
  • Lower GPA students were far more likely to participate in class remotely than higher GPA students.
  • Lower GPA students were more likely to miss class.
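A minimal sketch of this kind of comparison, splitting the class at the median incoming GPA; the table and column names are the same hypothetical stand-ins as above, and Welch's t-test is one reasonable choice rather than necessarily the test used in this study.

    import pandas as pd
    from scipy.stats import ttest_ind

    df = pd.read_csv("per_student_measures.csv")
    median_gpa = df["gpa"].median()
    high, low = df[df["gpa"] >= median_gpa], df[df["gpa"] < median_gpa]

    for m in ["questions_answered", "pct_correct", "slides_with_notes",
              "in_person_sessions", "remote_sessions", "absences"]:
        # Welch's t-test: does the mean differ between high- and low-GPA groups?
        t, p = ttest_ind(high[m], low[m], equal_var=False)
        print(f"{m:>20s}: high = {high[m].mean():6.1f}, "
              f"low = {low[m].mean():6.1f}, p = {p:.3f}")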

These results provide new insight into differences between more and less successful students and can help in the design of tailored feedback for future students. Next steps will be to 1) replicate all or parts of these analyses in other classes using LectureTools or similar technologies that can measure multiple aspects of student participation, and 2) design an “engagement measure” for use in this course that identifies students whose participation patterns (and possibly past performance measures) resemble those of students with low outcomes in this research. The engagement measure would be used to identify students early in the semester who could be invited to help design interventions to improve their performance.
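One possible shape for such a measure is a weighted sum of standardized participation measures, sketched below; the weights are illustrative guesses, not values fitted in this research, and would need to be tuned against the participation patterns of low-outcome students.

    import pandas as pd

    df = pd.read_csv("per_student_measures.csv")

    # Assumed weights for illustration only; a real measure would be
    # calibrated per course against observed outcomes.
    weights = {
        "questions_answered": 0.3,
        "pct_correct": 0.3,
        "slides_with_notes": 0.2,
        "in_person_sessions": 0.2,
    }

    # Standardize each measure, then combine into a single score.
    z = (df[list(weights)] - df[list(weights)].mean()) / df[list(weights)].std()
    df["engagement"] = sum(w * z[c] for c, w in weights.items())

    # Flag the lowest quartile for early-semester outreach.
    at_risk = df[df["engagement"] < df["engagement"].quantile(0.25)]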

Moreover, these measures and this report represent a model for how to design instructor dashboards as ever-increasing amounts of data become available about student participation. Vetting which information is actionable and useful, and which is interesting but unactionable or unrelated to outcomes, must be based on evidence obtained in class. Future systems should give instructors both options for defining what matters in their course and guidance on what has been found in the past, along with ways to perform similar analyses of their own courses.

Perry Samson
University of Michigan
