Reframing Assessment

14 Student Lecture Viewing: Learning from an Online Health Psychology Minor

Thomas Brothen, Penny Nichol and Esther Joy Steenlage Maruani

Keywords

online lectures, lecture capture, lecture viewing, learning outcomes, health psychology

Introduction

The availability of and registration in online courses are expanding. Basic Internet searches using terms such as online courses, online colleges, or online degrees yield millions of hits, suggesting that interest in them is also very high. Drouin (2013) reviewed literature on this issue and concluded that the increase is real and significant. And although the popularity of the MOOC phenomenon may have peaked, Hill (2016) credited that online course structure with helping to validate online education as a useful resource. Hill also concluded from his research that many students prefer online courses and that possibly half of the students enrolled in them would not take a traditional, live course, for a number of reasons. This finding suggests that post-secondary educational institutions need to develop online courses to serve their current and prospective student populations. Although the issue is not settled (cf. Figlio, Rush, & Yin, 2010), general support for adding online courses came in a 2009 U.S. Department of Education meta-analysis of studies comparing online and traditional courses (Means, Toyama, Murphy, Bakia, & Jones, 2009). The researchers found that “on average, students in online learning conditions performed better than those receiving face-to-face instruction.” Consistent with the findings reviewed above, the report also noted that the prevalence of online courses has been increasing greatly.

As the popularity of online courses continues to increase and colleges see them as a way to attract and maintain student enrollments, instructors are faced with choices as to how to structure them. Because the lecture method is still the dominant feature of college instruction, instructors designing online courses are likely to include recorded lectures even though research supporting lectures’ effectiveness is mixed at best (Costin, 1972; Freeman et al., 2014; McKeachie & Hofer, 2002). Additionally, alternatives to lecturing such as the Personalized System of Instruction (PSI; Keller, 1968) have been shown to be consistently superior to traditional lecture teaching methods (Kulik, Kulik, & Bangert-Drowns, 1990). Further, mastery learning systems also have led to superior student learning compared to the traditional lecture method (Bloom, 1976; Kulik et al., 1990). There also has been a more recent movement away from lecturing toward cooperative learning and peer discussion (Johnson & Johnson, 2009; Mazur, 2009). Nonetheless, lecture remains the dominant instructional technique in colleges, and this chapter reviews our evaluation of that method in three online courses that are part of an online minor in health psychology.

Our primary question is how much educational value lectures added to students’ learning outcomes in our courses. Our second question is how lectures compared to other course activities in their effect on students’ learning. To answer these questions we first needed to determine more precisely how much of the available lecture material students actually viewed. Most studies assess this by simply asking students whether they watched the lectures (e.g., Bosshardt & Chiang, 2016); we assessed that variable more objectively. We performed this study in three courses situated within a new program in our Department.
The Department of Psychology offers two majors (BA and BS) in addition to a minor in psychology. The Department has the largest undergraduate program in the liberal arts college, with nearly 1500 majors in the BA and BS programs and additional students pursuing minors. Several years ago, difficulty arose in meeting student demand for the core curriculum courses required of students pursuing both majors and minors. Consequently, the Department requested to drop the minor because of competition for these core courses from students pursuing the minor. After pushback from the College administration and the College Assembly, the Department reaffirmed the minor and attempted to meet student demand by initiating an online minor in 2012. The minor created at that time consists of several online courses and focuses on the theme of Health Psychology.

The minor in Health Psychology includes six courses, each of which is taught in a fully online version that duplicates the content and course goals of already existing live courses. This chapter presents data from an analysis of three of these courses: Introduction to Psychology, Introduction to Psychological Measurement and Data Analysis, and Introduction to Research Methods. Our primary goals in the research described here are to determine first how much students actually watched lectures in our online courses and then whether the lectures added educational value. Our overall goal is to provide a perspective on how much students value online lectures and thus help instructors decide whether to use them in their online courses. To accomplish these goals, we set out to gather data that are more valid than the questionnaire data typically gathered by researchers to assess the amount of online lecture watching, and to relate those data to data on other course elements.

Method

For each of the three courses described here, we recorded lectures with the Mediasite lecture capture system (http://www.sonicfoundry.com) and made them available to students as links through the Moodle course management system (CMS). The Mediasite platform is a lecture capture system that is within “a subset of streaming products… designed specifically for capture and management of classroom content. These products rose from the…need for educational institutions to record and archive content” (http://www.sonicfoundry.com/resource/wainhousereport/?aliId=62648363). Students watch the lectures in one window on their screens and the PowerPoint slides in a second window. They can stop at any time during the lecture, and when they return they are brought back to the point where they stopped previously. The built-in analytics track students’ usage and “shows a specific user’s (or group of users’) viewership over any time period, including videos watched, viewing activity, durations, registration data and more” (http://www.sonicfoundry.com/mediasite/manage/analytics/). Mediasite records how much of each lecture each student viewed on one or more occasions, counting only previously unviewed sections toward the total. From these records we obtained the percentage watched (0–100%) for each student on each lecture and then computed a Total Lecture Viewing average, also ranging from 0 to 100%, across all the lectures in each course.
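The unique-coverage calculation described above can be approximated as an interval-merging computation. The sketch below is illustrative only (the data layout and function names are our own, not Mediasite's API): re-watched portions of a lecture are counted once, and per-lecture percentages are averaged into a 0-100% semester figure.

```python
def percent_watched(intervals, lecture_length):
    """Percent of a lecture covered by watched (start, end) intervals,
    counting overlapping re-watches only once."""
    covered = 0.0
    last_end = 0.0
    for start, end in sorted(intervals):
        start = max(start, last_end)   # skip the portion already counted
        if end > last_end:
            covered += end - start
            last_end = end
    return 100.0 * covered / lecture_length

def total_lecture_viewing(per_lecture_percents):
    """Semester-level average (0-100%) across all lectures for one student."""
    return sum(per_lecture_percents) / len(per_lecture_percents)
```

For example, a student who watches minutes 0-5 and later re-watches minutes 3-10 of a 20-minute lecture has covered 10 unique minutes, or 50%.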

PSY 3801: Introduction to Psychological Measurement and Data Analysis

The Introduction to Psychological Measurement and Data Analysis course (PSY 3801) is offered in two formats taught by different instructors – a traditional, in-person lecture and laboratory format and a completely online format. In the semester during which this study was conducted, the traditional format began with 360 students, who attended 50-minute lectures three times a week and a 75-minute laboratory session once a week. Forty-six students started the online version of the course. Those students watched all lectures online and participated in online laboratory work. The lecture modules (i.e., short videos that focused on a specific topic, subtopic, or computational example) were recorded specifically for the online section in a studio using Mediasite lecture capture. Overall, 98 modules were available for viewing, running roughly 3 minutes to 30 minutes in length, with an average module length of 13.22 minutes (SD = 6.40). Approximately 11% of the modules focused on review material or introductions to topics, 49% covered new course content, and 40% consisted of computational examples of the statistical tests.

Students in the online section completed:

  • 10 problem set assignments,
  • 13 chapter quizzes,
  • two midterm exams, and
  • a final exam.

Students also could earn extra credit by completing additional assignments related to course work.

For this chapter, the assessment information examined included:

  • each chapter quiz score;
  • total chapter quiz score;
  • total problem set score;
  • each individual exam score, including a total exam score;
  • extra credit score; and
  • total points for the course.

PSY 3001W: Introduction to Research Methods

The Introduction to Research Methods course (PSY 3001W) is a writing-intensive course offered in two formats taught by different instructors – a traditional, in-person lecture and laboratory format and a completely online format. The traditional format offered during the semester included in this study began with 288 students who attended 50-minute lectures two times a week and a 105-minute laboratory session once a week. Twenty-four students began the semester in the online course. These online students watched all lectures online and participated in online laboratory work, including a group research project. As with the data analysis course, online lectures were broken down into modules by topic or subtopic. For this course, 111 modules were available for viewing, running roughly 2 minutes to 29 minutes in length, with an average module length of 11.10 minutes (SD = 5.95). Eight percent of the modules focused on review material or introductions to topics, 76% covered research methods content, and 26% consisted of writing tips and APA style guidelines.

The online students completed:

  • writing assignments, mostly related to an APA-style research paper,
  • weekly lab participation assignments,
  • weekly quizzes,
  • two midterm exams, and
  • a final exam.

Online students also could earn extra credit by completing additional assignments related to course work.

For this chapter, the assessment information examined included:

  • students’ paper-related writing score total;
  • overall writing score total;
  • total quiz score;
  • total participation score;
  • each individual exam score, including a total exam score;
  • extra credit score; and
  • total points for the course.

PSY 1001: Introduction to Psychology

The Introduction to Psychology course (PSY 1001) is offered each semester in live, hybrid, and online formats from which students can select. Although the formats differ, they cover exactly the same material: all students use the same textbook, complete essentially the same assignments, and take the same three mid-semester exams and final exam in a computerized testing center. All students attended or watched the same 39 fifty-minute lectures. For the semester in which this analysis was performed, a total of 1207 students began the course. In the first, live variation, 582 students finished by completing the work and taking the final exam; they attended live lectures three days per week and a live one-hour discussion section one day per week. In the second, hybrid live/online version, students watched the lectures online and attended a live one-hour discussion section one day per week; 369 finished the course. In the third, totally online version, students watched the lectures online and participated in an online, asynchronous discussion section each week; 125 finished. Finally, 68 students from the University Honors Program enrolled in a variant of the hybrid version in which they watched the lectures online and attended a live two-hour discussion section each week in which the regular discussion activities were handled in more depth. Because of the higher academic proficiency and status of the Honors students, that section was eliminated from all subsequent analyses.

For the introductory psychology class, we collected Mediasite lecture viewing data for each of 39 fifty-minute class lectures that were given live in class, then recorded, and made available to all students within 5 minutes of each lecture’s conclusion. In addition to the total percentage lecture viewing variable, we collected two other input variables that we predicted would contribute to course performance: discussion activity performance, measured by total points obtained, and total points obtained on practice exams. Students had practice exams available for each of the three mid-semester exams and the final exam. The purpose of the practice exams was for students to assess their knowledge and get direction as to what they should restudy. Students could repeat them as many times as they liked. The practice exam items are drawn randomly from large item pools that measure what also is measured on the exams. In this and previous semesters, the correlations between points earned on exams and points on the practice exams have been consistently around +.70. We also collected two student academic measures from the Registrar’s office – cumulative college GPA and ACT Comprehensive exam score. Finally, to determine whether personality type moderated any of the other variables, we recorded students’ scores on a Big5 Personality Scale (DeYoung, Quilty, & Peterson, 2007) that we had administered online as part of a class assignment. All data collection for this study was consistent with our University Human Subjects protocols.
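The practice-exam mechanism described above, with items drawn randomly from a large pool so repeated attempts differ, amounts to sampling without replacement. In this sketch the pool size, exam length, and names are all invented for illustration; they are not the course's actual materials.

```python
import random

def draw_practice_exam(item_pool, n_items, rng=random):
    """Sample n_items distinct questions, so repeated attempts differ."""
    return rng.sample(item_pool, n_items)

# Hypothetical stand-in for a large item pool.
pool = [f"item_{i}" for i in range(100)]
exam1 = draw_practice_exam(pool, 10)
exam2 = draw_practice_exam(pool, 10)   # almost surely a different set
```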

We collected data from the two versions of the introductory psychology course that utilize online lectures. Students in the totally online section completed weekly graded asynchronous discussion assignments worth approximately 17% of their grade and took three mid-semester exams and a final exam worth 51% of their grade. Students were told that exams were structured around the lectures and that some course material could be gained only through them. To help them prepare for exams, students had practice exams covering the same concepts tested on the exams, but with different items drawn randomly from large pools so they never got the same exam twice. Students in the hybrid section we investigated had live discussion sections once each week and operated under the same grading process. An ANOVA comparing total exam points among the live, hybrid, and online sections showed they did not differ (F(2, 1049) = 1.04, n.s.). Because this and subsequent analyses determined that the same pattern of results occurred in the hybrid and online variations, we combined their data to obtain more statistical power for the final analyses in this study.

Results

PSY 3801: Introduction to Psychological Measurement and Data Analysis

For the measurement and data analysis course, 40 students completed the course. Their total lecture watching averaged 28.28% for the semester. The average percentage of lecture module viewing was correlated with total quiz score, total problem set score, individual exam scores, total exam score, extra credit score, and overall total points in the course. The relationships between average percent viewing and total quiz score (r(38) = .175, n.s.), total problem set score (r(37) = -.131, n.s.), midterm exam 1 score (r(38) = .142, n.s.), midterm exam 2 score (r(38) = .257, n.s.), final exam score (r(38) = .215, n.s.), total exam score (r(38) = .235, n.s.), and extra credit score (r(38) = .261, n.s.) were all nonsignificant. However, the relationship between average percent viewing and overall total points in the course was significant, r(38) = .322, p < .05. Over the course of the semester, the average percent of each module watched decreased from a high of 74% for the first lecture module to a low of 5% for the last lecture module. There was a significant decrease in average lecture viewing from the first week to the last week of the course (t(16) = 9.40, p < .001).

PSY 3001W: Introduction to Research Methods

Twenty-three students completed the research methods course. Their total lecture watching averaged 34.87% for the semester. Most of the course assessments were not significantly related to the average percentage of lecture module viewing. For writing, the paper components (r(21) = .084, n.s.) and total writing score (r(21) = .086, n.s.) showed virtually no correlation with total viewing; the correlation between participation and viewing (r(21) = .083, n.s.) was roughly the same as that of the writing components. The correlations between total quiz score and viewing and between midterm exam 2 score and viewing were the same (r(21) = .325, n.s.). The correlation between extra credit score and average viewing was negative (r(21) = -.204, n.s.). Unlike the measurement and data analysis course, total course points and average viewing were not significantly correlated (r(21) = .135, n.s.), whereas midterm exam 1 scores (r(21) = .434, p < .05) and final exam scores (r(21) = .466, p < .05) were significantly correlated with lecture viewing. The pattern of average percentage watched per module was similar to that of the measurement and data analysis course, with a higher percentage watched near the beginning (75% for the first module of week 2) and a lower percentage at the end (15% for the second-to-last module) of the course. Again, there was a significant decrease in average lecture module viewing from the first week to the last week of the course (t(14) = 11.13, p < .001). One key difference was a spike in lecture module viewing after the first exam (75% for the Introduction Formatting Guidelines module), but this might have been due to the nature of the module – the formatting guidelines for the students’ first major writing assignment. Additionally, average lecture watching for the research methods course was higher throughout the semester than for the measurement and data analysis course.

PSY 1001: Introduction to Psychology

The results from the data analysis and research methods courses were replicated and extended in the introductory psychology course. We first examined the percentage of lectures watched by students. As described above, we obtained the Mediasite-calculated percentages for each student on each lecture and computed a Total Lecture Viewing average that ranged from 0 to 100%. Averaged over all 39 recorded lectures, students watched 37.38% of the lecture material. The average watched declined throughout the semester: the average percentage for the first three lectures was 44% and for the last three, 36%. The linear trend for the decline across all 39 lectures was significant, F(1, 493) = 38.59 (p < .001). Clearly, lecture watching was much less than 100% and declined as the semester proceeded. We next examined the relationships among the six variables described above.

Correlating the six variables (lecture viewing, discussion activity points, exam points, practice exam points, cumulative GPA, ACT Comprehensive score) revealed a correlation of r(494) = .421, p < .001 between Total Lecture Viewing and Exam Total. This suggested a positive relationship between lecture viewing and exam performance. We then examined the relationships of the other variables with Total Lecture Viewing: Participation Points Total correlated r(494) = .261, p < .001; Practice Exam Total, r(494) = .217, p < .001; cumulative GPA, r(494) = .392, p < .001; and ACT Comprehensive score, r(494) = .097, p < .05. To assess how much the other variables were associated with exam performance, we next examined the correlations of Exam Total with Participation Points Total, r(494) = .455, p < .001; Practice Exam Total, r(494) = .630, p < .001; cumulative GPA, r(494) = .706, p < .001; and ACT Comprehensive score, r(494) = .375, p < .001. The personality variables measured by the Big5 scale (Openness to Experience, Conscientiousness, Extraversion, Agreeableness, and Neuroticism) did not correlate with Total Lecture Viewing and thus were dropped from further analysis.

For the next analysis, we defined Exam Total as the outcome variable and the others (lecture viewing, discussion activity points, practice exam points, cumulative GPA, ACT Comprehensive score) as predictor variables to assess how much lecture viewing affected students’ learning performance. To control for dependencies among the predictor variables, we performed a stepwise linear regression of the outcome on the predictor variables. The results indicated an overall R = .826 (p < .001). The analysis accounted for a very substantial 68.2% of the total variance, with four of the five predictor variables adding significant (p < .05) variance. The first variable to enter the equation was cumulative GPA, which accounted for 45.8% (p < .001) of the variance. Second was Practice Exam Total, which accounted for an additional 14.2% (p < .001) of variance; third was ACT Comprehensive score (7.6%; p < .001); and last was Total Lecture Viewing (.7%; p < .05). Participation points was not a significant predictor in the analysis.
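The stepwise idea can be sketched as greedy forward selection, adding at each step the predictor that most increases R² and recording the increment. This is a simplified illustration on synthetic data (all coefficients and variable names are invented, and the chapter's analysis used a standard statistical package; real stepwise procedures also apply entry and removal significance tests rather than a simple improvement threshold).

```python
import numpy as np

def forward_stepwise_r2(X, y, names):
    """Greedy forward selection: at each step, add the predictor that most
    increases R^2; return (name, incremental R^2) pairs in order of entry."""
    n = len(y)
    remaining = list(range(X.shape[1]))
    chosen, steps, prev_r2 = [], [], 0.0

    def r2(cols):
        # OLS fit with an intercept on the selected columns
        A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

    while remaining:
        best = max(remaining, key=lambda c: r2(chosen + [c]))
        new_r2 = r2(chosen + [best])
        if new_r2 - prev_r2 < 1e-6:   # no meaningful improvement; stop
            break
        chosen.append(best)
        remaining.remove(best)
        steps.append((names[best], new_r2 - prev_r2))
        prev_r2 = new_r2
    return steps

# Synthetic data loosely shaped like the study: GPA strongest, then
# practice-exam points, then lecture viewing (all numbers invented).
rng = np.random.default_rng(0)
n = 500
gpa = rng.normal(3.0, 0.5, n)
practice = rng.normal(0.0, 1.0, n)
viewing = rng.normal(0.0, 1.0, n)
exam_total = 5.0 * gpa + 2.0 * practice + 0.3 * viewing + rng.normal(0.0, 1.0, n)

X = np.column_stack([viewing, practice, gpa])
steps = forward_stepwise_r2(X, exam_total, ["viewing", "practice", "gpa"])
```

On data constructed this way, GPA enters first and later predictors contribute smaller increments of R², mirroring the ordering of entry reported above.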

Summary

Across all three classes, lecture viewing was low, averaging well under 40%. Viewing also declined significantly during the semester in all three classes. Overall, the positive effects of lecture viewing were small and, in most cases, not statistically significant. In the introductory psychology course, our data revealed that lecture viewing had some impact on exam performance but other activities had more.

Discussion

Gysbers, Johnston, Hancock, and Denyer (2011) asked Australian biochemistry and molecular biology students who had access to online lectures why they still came to live lectures. They found that students believed live lectures provided a better explanation of the material and a better social environment, or simply attended lectures because that seemed the right thing to do (see also Jensen, 2011). This illustrates that lectures are traditionally favored by both instructors and students, but it raises the question of how useful they really are. This study explicitly addressed that question.
The three courses represented in this study had hybrid or online versions that had to meet the same Departmental requirements for content learning: if the live course covered material in lectures, it also had to be covered in the hybrid or online versions. This requirement is handled differently across the three courses—from specially created modules in the data analysis and research methods courses to video copies of the live lectures in the introductory course. Yet our data reveal very similar results for all of them. In the introductory psychology, measurement and data analysis, and research methods courses, percentage of viewing averaged only 37.5%, 28%, and 35%, respectively, and in all three courses lecture viewing declined throughout the semester. Parenthetically, in a past semester’s live lecture version of the introductory psychology course, lecture attendance also declined throughout the semester, to a low of about 40% at the end (Wu, 2015). The low percentages of lecture viewing suggest that students actually value watching lectures much less than they say they do, and much less than lectures’ central place in courses would suggest. We return to our rather narrow original question of how much value students apparently see in watching online lectures and how much lecture viewing adds to student learning.

Studies such as that by Bosshardt and Chiang (2016) have mostly utilized quasi-experimental comparisons of live-lecture and online-lecture course sections and found no effects, or data favoring one over the other. Such studies have not provided a good answer as to baseline lecture viewing by students. This study looked more carefully at actual student behavior as tracked by the Mediasite platform. We found that the percentage of lectures watched was below 40% for the introductory psychology course for most of the semester. For the measurement and data analysis course, average lecture watching was higher in the first half of the semester (48%) than in the second half (21%). This trend also held in the research methods course, with an average of 51% lecture viewing in the first half of the semester and 26% in the second.
Although lecture viewing had a positive effect on student learning as measured by exam points, that effect was very small—less than 1% of variance accounted for in total exam points for the introductory psychology course. The exams in the introductory psychology and research methods courses were primarily lecture-based rather than textbook-based, and lecture viewing correlated significantly with most of the exam-related components of these two courses. Yet because students watched well under half of the available lecture material, they clearly were getting much of the tested information from somewhere other than watching the lectures. We suspect that many students used resources such as the textbook, internet sites related to the material, or other students’ lecture notes. We see this as a subject for further research.

Most who have taught college courses have felt the pressure to include lectures. They either know or soon find out that students expect lectures and get anxious if they are not included as a major part of the course. New and different forms of instruction that do not include lecturing are undertaken at the instructor’s peril. The first author of this chapter has in his files a comment by a student made on a course evaluation in a course taught by the learning group method in a computer lab. The student wrote “All the instructor does in this class is walk around while the students do all the work.” For this type of reason and many others, lectures have remained central to college teaching. Even MOOCs, the “newest” wrinkle in the higher education fabric, originally were almost entirely recorded lectures although they have lately become more diverse in their approach (Ossiannilsson, Altinay, & Altinay, 2016).

The data obtained in this study suggests that activities such as problem sets and practice exams might be much more useful for student learning. Students occasionally remark during evaluations of our courses that they would rather learn from the textbook or find the textbook more valuable than lectures even though we tell them that our exams draw from the lectures. Based on our findings in this study, we have little concrete advice for instructors as to how they should structure their lectures but we do advise them to consult the large literature on increasing student learning (e.g., McKeachie & Hofer, 2002) to find ways to improve their students’ learning outcomes. We also believe that if there truly is benefit to students from lectures, instructors should think of them as study aids that some students might find useful rather than making them the course centerpiece.

References

Bloom, B. S. (1976). Human characteristics and school learning. New York: McGraw-Hill.

Bosshardt, W., & Chiang, E. P. (2016). Targeting teaching lecture capture learning: Do students perform better compared to face-to-face classes? Southern Economic Journal, 82, 1021–1038. doi:10.1002/soej.12084

Costin, F. (1972). Lecturing versus other methods of teaching: A review of research. British Journal of Educational Technology, 3(1), 4-31. doi:10.1111/j.1467-8535.1972.tb00570.x

DeYoung, C. G., Quilty, L. C., & Peterson, J. B. (2007). Between facets and domains: 10 aspects of the Big Five. Journal of Personality and Social Psychology, 93, 880–896. doi:10.1037/0022-3514.93.5.880

Drouin, M. A. (2013). If you record it, some won’t come: Using lecture capture in introductory psychology. Teaching of Psychology, 41(1), 11-19. doi:10.1177/0098628313514172

Figlio, D., Rush, M., & Yin, L. (2010). Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. doi:10.3386/w16089

Freeman, S., Eddy, S. L., Mcdonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415. doi:10.1073/pnas.1319030111

Gysbers, V., Johnston, J., Hancock, D., & Denyer, G. (2011). Why do students still bother coming to lectures, when everything is available online? International Journal of Innovation in Science and Mathematics Education, 19(3), 20-36.

Hill, B. P. (2016). MOOCs are dead. Long live online higher education. The Chronicle of Higher Education.

Jensen, S. A. (2011). In-class versus online video lectures: Similar learning outcomes, but a preference for in-class. Teaching of Psychology, 38, 298-302. doi:10.1177/0098628311421336

Johnson, D. W., & Johnson, R. T. (2009). An educational psychology success story: Social interdependence theory and cooperative learning. Educational Researcher, 38, 365-379. doi:10.3102/0013189X09339057

Keller, F. S. (1968). “Good-bye, teacher…” Journal of Applied Behavior Analysis, 1, 79–89. doi:10.1901/jaba.1968.1-79

Kulik, C. C., Kulik, J. A., & Bangert-Drowns, R. L. (1990). Effectiveness of mastery learning programs: A meta-analysis. Review of Educational Research, 60(2), 265-299. doi:10.2307/1170612

Mazur, E. (2009, January). Farewell, lecture? Science, 323(5910), 50-51. Retrieved from http://www.jstor.org/stable/20177113

McKeachie, W. J., & Hofer, B. K. (2002). McKeachie’s Teaching Tips: Strategies, Research, and Theory for College and University Teachers (11th ed.). Boston, MA: Houghton Mifflin Company.

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.: U.S. Department of Education Office of Planning, Evaluation, and Policy Development.

Ossiannilsson, E., Altinay, F., & Altinay, Z. (2016). MOOCs as change agents to boost innovation in higher education learning arenas. Education Sciences, 6(3), 25. doi:10.3390/educsci6030025

Wu, W. (2015). Correlational relationship between class attendance and class performance in college. Unpublished manuscript, Department of Psychology, University of Minnesota.

Correspondence regarding this manuscript should be directed to Thomas Brothen, Department of Psychology, University of Minnesota, Twin Cities. This chapter was supported in part by National Science Foundation Grant NSF/IIS-1447788 to the first author.