Now that we are moving forward with analytics using LMS data, LIT can offer more information to help you redesign or revise your courses. We have some high-level results from the Bettering Learner Engagement Study (BLES) that have identified practices that appear to aid student success. In addition, we can provide analytics at the course level to help individual instructors, or groups of instructors teaching different sections of the same course, explore their courses and determine what changes, if any, they want to consider during course redesign or revision.
Since an example often illustrates the options available, I’ll present some work we’ve done with the Center for Academic Excellence in reviewing their Academic Honesty Workshop in preparation for the next offering this summer.
The backstory: the Academic Honesty Workshop ran for the first time last summer. All deposited first-year students were enrolled (no transfers, readmits, or CPCE students were in the cohort). The students were assigned to one of four sections to complete training on the Institute’s Academic Honesty Policy and to review or learn how to cite sources in academic papers.
The analysis began with a request for a report on student completion of the workshop, sorted by the English course and section in which students were enrolled for the fall term, so that instructors would know which students in their courses might need additional assistance. This was a reasonably simple report, compiled with assistance from the Office of Decision Support.
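For anyone curious what that looks like under the hood, here is a rough sketch in Python with pandas of how such a report could be put together. The file and column names (student_id, completed, english_section) are placeholders for illustration, not our actual exports.

```python
import pandas as pd

# Placeholder inputs: workshop completion exported from the LMS and
# fall English enrollment from the student information system.
completion = pd.read_csv("workshop_completion.csv")   # student_id, completed
english = pd.read_csv("fall_english_enrollment.csv")  # student_id, english_section

# Join the two sources so each English section lists which of its
# students have and have not completed the workshop.
report = english.merge(completion, on="student_id", how="left")
report["completed"] = report["completed"].fillna(False)

for section, rows in report.sort_values("student_id").groupby("english_section"):
    print(f"\n{section}")
    print(rows[["student_id", "completed"]].to_string(index=False))
```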
Being a data geek, I couldn’t resist exploring the data a bit further. All the sections were based on a master course: all the content, assignments, and activities were identical. Even so, I noticed differences in completion rates among the sections. Because the sections were identical content-wise, the differences had to be related to differences in what the instructors were doing. Diving into the database to look at how much time instructors spent in the course (their overall activity) revealed nothing that might explain the different completion rates. The only difference was the use of announcements: the section with the highest completion rate made greater use of the announcement tool. Since the number of announcements was small, I could do a quick content analysis. The announcements consisted of messages and feedback of the kind that, based on research (Hattie & Timperley, 2007), supports student success. These results suggested changes that could easily be implemented to help all workshop instructors encourage completion, such as providing boilerplate announcements that instructors can quickly customize, along with suggested time frames for sending them.
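To give a sense of how the section comparison works, here is a minimal pandas sketch. Again, the file and column names are hypothetical; the actual analysis was run against our LMS database.

```python
import pandas as pd

# Placeholder exports: one row per enrolled student, one row per announcement.
students = pd.read_csv("workshop_students.csv")   # section, student_id, completed
announcements = pd.read_csv("announcements.csv")  # section, posted_date, subject

# Completion rate for each workshop section.
completion_rates = students.groupby("section")["completed"].mean()

# Number of announcements posted in each section.
announcement_counts = announcements.groupby("section").size()

summary = pd.DataFrame({
    "completion_rate": completion_rates,
    "announcements": announcement_counts,
}).fillna(0)

print(summary.sort_values("completion_rate", ascending=False))
```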
In addition to looking at instructor activity differences among the sections, I also looked at student access patterns. Aside from completion rates, there was little difference in student behavior among sections. One pattern did stand out: there was about a two-week lag between the start of the workshop and when students began accessing it. Sessions were held during new student orientation to introduce the workshop and let students know when they needed to start. The data suggest that more, or earlier, communication may be needed to encourage students to start the workshop in a timely manner. Interestingly, students kept accessing the workshop much longer than expected, often well into the fall term. This behavior led to the decision to leave the course open indefinitely as a resource students can refer back to as needed throughout their Wentworth careers.
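The access-lag pattern can be surfaced with a similarly small sketch, assuming a click-level activity export with hypothetical student_id and timestamp columns and a placeholder start date.

```python
import pandas as pd

WORKSHOP_START = pd.Timestamp("2018-07-01")  # placeholder start date

# Placeholder activity export: one row per student click in the course.
activity = pd.read_csv("workshop_activity.csv", parse_dates=["timestamp"])
# columns: student_id, timestamp

# Days between the workshop start and each student's first access.
first_access = activity.groupby("student_id")["timestamp"].min()
lag_days = (first_access - WORKSHOP_START).dt.days

# In our data, the typical lag was roughly two weeks.
print(lag_days.describe())
```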
We also found that students who completed the workshop accessed the Blackboard site over a span of at least two weeks, suggesting a minimum time commitment for successful completion. Suggestions for improvement include clearer messaging to students about the time commitment needed to succeed in the workshop.
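Along the same lines, the access-span comparison between completers and non-completers might look something like this, again with placeholder files and columns.

```python
import pandas as pd

activity = pd.read_csv("workshop_activity.csv", parse_dates=["timestamp"])  # student_id, timestamp
students = pd.read_csv("workshop_students.csv")                             # student_id, completed

# Span between each student's first and last access, in days.
span = (activity.groupby("student_id")["timestamp"]
                .agg(first="min", last="max")
                .reset_index())
span["days_active"] = (span["last"] - span["first"]).dt.days

# Compare how long completers and non-completers were active in the site.
merged = span.merge(students, on="student_id")
print(merged.groupby("completed")["days_active"].describe())
```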
Finally, we were able to look at the impact of the workshop itself: whether students who completed it were later placed on academic probation. Students who completed the workshop were less likely to be on academic probation in the second semester of their first year than peers who enrolled in the workshop but did not complete it.
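That comparison is essentially a two-by-two table of completion against probation status, sketched below with a hypothetical spring probation flag keyed by student_id.

```python
import pandas as pd

students = pd.read_csv("workshop_students.csv")   # student_id, completed
probation = pd.read_csv("spring_probation.csv")   # student_id, on_probation

merged = students.merge(probation, on="student_id", how="left")
merged["on_probation"] = merged["on_probation"].fillna(False)

# Probation rate for completers versus non-completers.
print(merged.groupby("completed")["on_probation"].mean())

# Two-by-two table, useful as input to a chi-square test if desired.
print(pd.crosstab(merged["completed"], merged["on_probation"]))
```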
By applying analytics to multiple sections of a single course, we have been able to provide data to inform the course revision process. Following the revision, we will revisit the analysis to see how the changes affect outcomes for students.
The kind of analysis we performed for the Center for Academic Excellence is an example of what LIT can offer to individual instructors or groups. Depending on your goals and the data available, we can customize the analysis to fit your needs. If you are interested in having your course(s) analyzed, contact lit@wit.edu to start the discussion.
Reference:
Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487