
15 Under the Watchful Eye of Online Proctoring

Daniel Woldeab, Thomas Lindsay and Thomas Brothen

Keywords

online, proctoring, student, exams, academic integrity

 

Introduction

Over the last two decades, the steady growth of online learning at public higher education institutions has raised the challenge of how best to assess students' progress online while safeguarding the integrity of exams. To address this challenge, many institutions have outsourced the proctoring of student exams to online proctoring providers, which give students a way to take exams at a distance through a secure platform. Students need a working computer with a functioning webcam and access to the internet. The online proctoring service monitors students while they take the exam and allows them to take it from any location they choose: from their own home, supervised online, to a testing center where supervision is provided by live proctors. This study used ProctorU as the proctoring service provider. Students in fully online courses took their exams online and were thus supervised remotely. Students in traditional classroom or hybrid courses (a mix of online and face-to-face instruction) took their exams in a testing center, where they were supervised by in-person proctors.

It is well documented that U.S. public universities and colleges are offering online courses at an increasing rate. Students appreciate the convenience that online learning gives them, while institutions can use these environments to broaden their reach and to expand and diversify their offerings. Allen and Seaman (2010) reported that 30 percent of U.S. college and university students were enrolled in at least one online course. The authors also reported that online enrollments in higher education in the U.S. grew at a much faster rate than overall enrollments in traditional university classes (Allen & Seaman, 2010).

For the most part, the students enrolled in online courses today are not the conventional distance learners, i.e., adult learners mostly living and working far from a university campus. Geographical distance has very little to do with today's online learners; the majority are students who live on or near university campuses. Mann and Henneberry (2012) noted that many of the students taking online courses are of more traditional college age, between 18 and 24. For the most part, these students gravitate toward online offerings because the format fits their work and other obligations; without this convenience it would be very difficult for them to make the progress needed in their programs, and hence to complete them and graduate on time. To meet this demand for flexibility, many institutions offer a considerable number of online courses (Allen & Seaman, 2011).

However, as these institutions continue to grow their online offerings, educators are increasingly concerned about how best to ensure academic integrity (Barnes & Paris, 2013). Maintaining honesty in the learning environment becomes even more significant, as one of the fundamental aspects of instruction is the assessment of student learning. The purpose of this study, therefore, is to examine student and faculty satisfaction with a solution to that problem: online proctoring.

Review of Related Literature

In the last two decades, online education has made room for many private startup online colleges and universities throughout the United States, and a growing number of traditional American public higher education institutions have also begun providing online education. These institutions face questions of how to navigate this change and whether they can meet the needs of a contemporary student body that seeks flexible ways to attain its educational goals. At the same time, they struggle with ever-shrinking local and federal budgets, which may push them to expand their online offerings as a way to increase revenue and reduce costs.

Online education seems poised to become an educational norm in the years to come. A 2011 study by Ambient Insight Research noted that at the time of the study 1.25 million higher education students took all of their courses online, and 10.65 million students took some of their courses online (Adkins, 2011). Indeed, technology can greatly enhance the learning experience – bringing concepts and curricula to life in new ways. Likewise, when used effectively, online opportunities have provided higher education institutions with flexible options to expand their offerings into the global market (Casey, 2008). Harden (2013) also argues that the college classroom itself will in part become virtual, due to the advancements in information technology, the continued tuition increases that outpace inflation, and the country’s massive student loan debt.

Much of the literature shows that the shift across higher education is forward-looking, toward online instruction. If this becomes the teaching platform of the future, Harden (2013) predicts that "the gap between the online experience and the in-person experience will continue to close" thanks to "the greater interactivity and global connectivity that future technology will afford" (p. 56). Allen and Seaman (2015) of the Babson Survey Research Group state that online learning growth accounted for nearly three-quarters of all U.S. higher education enrollment increases in 2014.

More recently, Carey (2015), after going back centuries to scrutinize the model upon which America's higher education is based, a model he considers flawed, stresses that information technology is capable of providing quality, affordable forms of higher education. Equally, DeMillo (2011) contends that the new information technology forces moving higher education toward more virtual spaces will challenge institutions clinging to centuries-old models of higher education.

As public higher education institutions gradually embrace online education, maintaining academic integrity in cyberspace has added another challenging dimension, one of great concern to educators and institutions alike. In one of the most comprehensive studies of academic dishonesty, Bill Bowers (1964) surveyed 5,000 students at 99 U.S. universities and colleges and revealed that 75 percent of students surveyed had engaged in one or more incidents of academic dishonesty. This trend has been increasing: a study conducted by McCabe, Treviño, and Butterfield (2001) shows that both the magnitude and severity of academic dishonesty on college campuses have greatly increased in recent decades. Likewise, a study conducted throughout the United States and Canada by McCabe (2005), encompassing 80,000 students from 83 different colleges and universities, found that "one in five students (21%) has engaged in at least one … serious form of test or exam cheating" (p. 3). Even more dramatically, Gabriel (2010) reported that "in surveys of 14,000 undergraduates over the last four years, an average of 61 percent admitted to cheating on assignments and exams" (p. 2).

Relatively few articles have been written about academic dishonesty in unproctored online exams (e.g., Ercegovac & Richardson, 2004; Sterngold, 2004), and comparisons of academic dishonesty in online and traditional brick-and-mortar environments have also been explored (e.g., Grijalva, Nowell, & Kerkvliet, 2006; Shaw, 2004). However, faculty and student satisfaction with online proctoring remains very much unknown.
Moreover, the literature is inconsistent when it comes to the severity and magnitude of academic dishonesty taking place online versus in a traditional setting. While some studies found higher rates of cheating online (Lanier, 2006; Harmon & Lambrinos, 2008; Grijalva, Nowell, & Kerkvliet, 2006), others, such as Hart and Morgan (2010) and Stuber-McEwen, Wiseley, and Hoggatt (2009), found lower rates of academic dishonesty online than in traditional settings.

Others have looked at performance on proctored versus unproctored exams as an indicator of academic dishonesty. For example, Schultz, Schultz, and Gallogly (2007), comparing unproctored online exams and proctored paper-and-pencil exams in the same course, reported that students taking unproctored online exams scored significantly higher than those in the proctored setting. Similarly, Carstairs and Myors (2009) confirmed these findings: students who took their exams unproctored scored significantly higher than those who took proctored exams. Other studies took the more general approach of arguing that since most online exams are unproctored, it is difficult to verify exam takers' identity and deter cheating (Reynolds & Weiner, 2009).

Further, those taking online unproctored exams may have easier access to prohibited information during the exam (Reynolds & Weiner, 2009). However, the limited literature that exists seems to indicate that regardless of where the exam is taking place (in an online or brick-and-mortar setting), academic dishonesty occurs. Indeed, factors contributing to academic dishonesty in the traditional classroom are also present in the online teaching and learning environment, and educators’ concerns are well warranted.

It is well documented that online proctoring offers both faculty and students considerable advantages. Kinney (2001) noted that online proctoring is a valuable option for students who are geographically dispersed and for whom taking exams on campus is not feasible. Likewise, Tao and Li (2012) stated that when online proctoring is used to assess students attending conventional brick-and-mortar classes, it reduces instructional time dedicated to testing, allowing educators and students to engage more with the course content. Furthermore, Naglieri et al. (2004) asserted that online exams are more scalable and efficient than paper-and-pencil exams. The next section, on methodology, describes the characteristics of the research participants and the survey instruments we used to assess student and faculty satisfaction with online proctoring.

Methodology

To understand students' decision to participate in, and their satisfaction with, classes that use online proctoring of exams, we assessed three courses in a single department over two consecutive semesters: two upper-level courses that were conducted entirely online and one large introductory course with online, face-to-face, and hybrid sections. Instructors agreed to participate in a focus group about their experiences and to offer students extra course credit for participation. A total of 865 students consented to participate in the study: 339 in fully online courses with online-proctored exams, 357 in traditional lecture courses with exams in a testing center, and 169 in hybrid courses with a mix of in-person and online course delivery and exams in a testing center. Students taking their exams through the online proctoring service first had to check that their equipment met the service's requirements of a webcam and microphone. Next, they established an account with the company and scheduled their exams within the exam time window established for the class. When they went to the proctoring site for their exams, a proctor checked that they were in a reasonably private testing location, administered an identification protocol, and then entered the password into the Moodle exam. For the duration of the exam, live proctors monitored the students, and the exam was video recorded in case of any possible security breaches.

Table 1: Participant details (number of students by course format)

                                        Fully online   Face-to-face   Hybrid
Consented to participate in study            339            357         169
Completed the post-survey                    316            351         169
Completed both pre- and post-surveys         114            264         126

We asked students to consent to participate in the study and to share their course grades with the researchers. Consenting participants completed a pre-survey near the beginning of the semester, before any exams had been given, as well as a post-survey immediately following the final exam. All procedures were approved by the Institutional Review Board. The 836 students (316 online, 351 face-to-face, 169 hybrid) included in this study are the total number who both consented and completed the final post-survey. For comparisons that require both the pre-survey and the final data, the number is reduced to 504 students (114 online, 264 face-to-face, 126 hybrid).

Both surveys followed the same structure, divided into five general topical areas: scheduling, format, technology readiness, questions about the exam, and the experience of being monitored. The pre-survey asked about previous experiences, reasons for choosing the specific class and its exam format, and expectations for online-proctored examination. The post-survey followed up on the specific experience of the exam students had just completed, how it aligned with their expectations, and their considerations for future online-proctored exams. The course instructors' role in the research study was to help recruit student participants and to serve as research participants themselves. We asked instructors to keep notes about questions and concerns they observed during the semester. They also participated in a focus group at the end of the first semester to discuss their views of and experiences with online proctoring, insights into student experiences, and any effects of online proctoring on their classes.

Findings

Pre-Survey
In this chapter, we report only results for those students taking their exams through the online proctoring service. Of the 114 online-proctoring students who took the pre-survey, 81% reported that they had previously taken exams online or on a computer, and the same percentage claimed no concerns about the technology they would need for exams. Despite this, more than one third (38%) reported being somewhat or not at all confident that they had the necessary equipment, and a majority (52%) reported being somewhat or not at all confident that they had the expertise to set up, use, and/or navigate any technological aspects of the exam environment (see table 2a). Only 40% of students reported being comfortable or very comfortable taking exams in the course’s testing environment, with 44% being somewhat comfortable and 16% selecting “not at all comfortable” (see table 2b).

Table 2a: Pre-Survey Confidence in Online Proctoring Technology

                                                  Not at all or          Confident or
                                                  somewhat confident     very confident
Have all equipment needed                                38%                  62%
Have expertise to set up, use, and/or navigate           52%                  48%

Table 2b: Pre-Survey Comfort with Online Proctoring Environment

                                                            Not at all or           Comfortable or
                                                            somewhat comfortable    very comfortable
Comfort with taking exams in course's testing environment          60%                    40%
Comfort with presence of proctor                                   70%                    30%

Half (50%) of online-proctoring students reported that the format of the class (online lectures and discussions, with proctored online exams) was their first choice of course format. In open-ended responses participants cited several reasons why the format was or was not their first choice, but even the most common themes (convenience of scheduling, preference for face-to-face conversation) were mentioned by only a small number of participants. Scheduling was cited by some as an advantage of online proctoring, but others viewed it as a disadvantage: while only 24% indicated concerns about scheduling exams for the course, participants were almost evenly split on how well the scheduling options would work for them, with 53% responding that the options would work "extremely well" or "well" and 47% responding that the options would work only somewhat or not at all. Similarly, when asked whether they expected exam scheduling to be easier or harder than in other courses they had taken, a third (33%) expected it to be about the same, a quarter (25%) expected it to be harder, and 40% expected it to be easier.

In a series of questions about their expectations of being monitored during the exams, students expressed a significant level of discomfort with online proctoring: 70% of respondents were not at all or only somewhat comfortable, while only 30% indicated they were either comfortable or very comfortable with the presence of proctors during their online exams (see table 2b). While a majority (56%) expected the overall level of monitoring to be "about right", fully 42% thought it would be "too much", and very few participants (3%) thought it would be "too little". Students were not concerned about the proctors distracting them from their exams, however: a majority (59%) indicated that in prior exams proctors were never distracting, and a third (33%) selected "somewhat or sometimes distracting", while just 9% stated that proctors were often or always distracting.

While students did not expect to ask many questions of the proctors, the ability to do so was considered important to a substantial number. Just a small minority of students expected it to be likely that they would ask either procedural or content questions of the proctors, with the overwhelming majority on both topics considering it only somewhat or not at all likely (see table 3). About a quarter of students (26%) expected the proctors to be helpful or very helpful, and a similar number (24%) expressed that proctors for previous exams had been usually or always helpful (see table 4). Despite students’ relatively low expectation of their likelihood to ask a question or of the proctor’s helpfulness in addressing it, over a third (35%) of participants indicated that the ability to ask these sorts of questions was important or very important.

Table 3: Questions Asked of Proctors

                        Pre-survey reported likelihood          Post-survey incidence
                        Likely or     Somewhat    Not at all    Question    Answered
                        very likely   likely      likely        asked       satisfactorily
Procedural questions        18%          51%         31%           9%            71%*
Content questions           10%          50%         40%           4%
* Of students who asked a question of either type.

Table 4: Expectations and Experiences of Proctors’ Helpfulness

                               Not at all or        Helpful or
                               somewhat helpful     very helpful
Pre-survey expectation               74%                 26%
Pre-survey prior experiences         76%                 24%
Post-survey experience               64%                 36%

Post-Survey
With the background of expectations laid out, the post-survey (316 online-proctoring student participants) was administered after the final exam for the classes being studied, and it demonstrates some of the complexities, successes, and challenges of online exam proctoring. While the majority of students (63%) reported no problems with the testing environment, a sizeable minority (37%) reported a variety of problems (e.g., trouble with scheduling, or with connecting to the test). Most of these were minor issues with setup or scheduling, but a small number of significant problems were reported. A similar number found the test environment to be only "somewhat conducive" (32%) or "not at all conducive" (6%) to their test-taking.

Scheduling and unexpected wait times appear to be significant challenges for students using online proctoring. Student wait times after scheduled exam start times were approximately evenly distributed across “less than five minutes” (19%), “five to ten minutes” (27%), “ten to fifteen minutes” (20%), “fifteen to twenty minutes” (16%), and “more than twenty minutes” (18%). Half of students who waited at least five minutes (51%) thought the wait was acceptable, but nearly two-thirds of that same group (65%) found that the wait induced stress.

On the other hand, the benefits of flexible scheduling were significant to many online-proctoring students. An eighth of participants (13%) were outside the metropolitan area in which the University is located at the time they took their final exam, and 3% reported being outside the United States. While a small majority (53%) of students reported that they could have easily taken the exam in person, 34% indicated that this would have been more trouble, and 13% reported that they could not have taken the exam at all if they had needed to travel to the University to take it (see table 5a). Similarly, while a majority (56%) of respondents reported that if the online proctoring tool were not available they would have had no trouble taking the exam, taking the exams would have been "somewhat more difficult" for 35% of students and "much more difficult" for 8%; and 2% of respondents indicated that without online proctoring they would have been unable to complete the exams (see table 5b).

Table 5a: Ability to Take Exams in Person if Online Proctoring were not available

Easily able   Able, but more trouble   Not able
   53%                 34%                13%

Table 5b: Difficulty of Taking Exams by Any Other Method

No more difficult   Somewhat more difficult   Much more difficult   Not possible
       56%                   35%                       8%                 2%

Unlike traditional, in-person proctored exams, online proctoring presented participants with both the benefit of being able to choose a time for their exam and the inconvenience of needing to find a time that would work. The data indicate that the benefits far outweighed the drawbacks. Very few students (6%) reported a schedule that was "not at all flexible", with the vast majority (72%) reporting their schedule for exams as "somewhat flexible" and about a fifth (22%) as "very flexible". In line with this, specific scheduling of online-proctored exams worked well, with more than four-fifths of students (82%) reporting that the scheduled time worked "extremely well" (28%) or "reasonably well" (54%). Nearly all participants (96%) reported that they were able to take the exam at a time and place that worked for them.

The post-survey data indicate that some students became more comfortable with online proctoring, but that many students still had concerns. Whereas 70% had expressed that they were "not at all comfortable" or "somewhat comfortable" with the presence of proctors in the pre-survey, that number dropped slightly, to 63%, when students reported on their comfort during their final exam. Concern over the level of monitoring also improved, with 71% marking the level as "about right", compared to 56% before the first exam. While 42% had thought it would be "too much" before the first exam, only 26% reported that it had been "too much" during their final exam, and very few participants (3%) thought it had been "too little" (see table 6). Online proctors were rated by students as not especially distracting, and about as distracting as proctors in other (presumably face-to-face) exams they had taken: a strong majority (62%, vs. 59% in the pre-survey) reported the proctors as "never or not at all distracting", with over a quarter more (27%, vs. 33% in the pre-survey) selecting "somewhat or sometimes distracting", for a total of 89% (vs. 92% in the pre-survey) indicating a low level of distraction.

Table 6: Level of Monitoring

                          Too much   About right   Too little
Pre-survey expectation       42%         56%           3%
Post-survey experience       26%         71%           3%

Overall, even fewer students asked the proctors procedural or content questions than had anticipated doing so (see table 3). A majority (71%) of students who asked questions reported that the proctor was able to answer their question(s) adequately. However, students reported that the proctors were not particularly helpful: a majority (64%) rated them not at all or somewhat helpful, while just over a third (36%) rated them helpful or very helpful. This number, while low, still represented a substantial improvement over pre-survey expectations (see table 4).

After taking a course for a semester using online-proctored exams, students remained divided on the relative merits and tradeoffs of this exam format. A small majority (58%) indicated that they would choose this testing environment in future classes. However, this response appears to be closely linked to whether or not the online-proctored format was their first choice of course format. A strong majority of those who had reported at the beginning of the semester that this was their first choice of format (75%) said that they would choose this testing environment again in the future, while an almost equally strong majority of those who had reported it was not their first choice (70%) said that they would not.

Faculty Focus Group
Two overall themes became apparent in the analysis of the instructor focus group. First, the instructors expressed three reasons for giving proctored exams: to assess students' retention of information and their ability to recall significant portions of the content and relay that information without notes or supplemental information; to more or less force students to keep up with content as the course progresses; and to encourage students not just to keep up with the reading but to work through the material. They generally agreed that online proctoring is a must for online courses. However, they also expressed concerns that, due to the issues involved in setting up online exam proctoring, they are using fewer high-stakes assessments, reducing the overall number of exams, and looking for lower-stakes methods of assessment. The main challenge with low-stakes online exams given through the course management system (i.e., Moodle) is uncertainty about whether students are doing the exam work by themselves.

One instructor said:

I had students who were looking for a way to compensate for some poor performance and I wanted to try to offer students something and I considered the possibility of adding another exam into our schedule … and I think the way I’m going to handle it is to do an unproctored exam and so that it is sort of like an assignment because I have to assume under those circumstances they will be using notes and textbook information in order to go through the exam material. So I would say it is a factor, though it’s not huge.

Another instructor stated that:

It really is about cost of proctoring and that’s why we are challenged with thinking about fewer assessments, because it’s expensive. I mean the challenge when we talk about having fewer exams, in my class there is a tremendous amount of material we have to cover. And to reduce the number of exams, means there will be more material on each exam, so it’s not a very attractive option. I’ve thought of open book exams as another possibility, so that’s why it’s not proctored, because they can just do whatever. But in the end you do want to know that the person who is responding is the person who is doing it.

In addition to the overall themes identified above, instructors’ answers to the four questions of interest can be summarized as follows:

  • Instructors' experience with resolution of academic dishonesty: Our respondents reported no known academic dishonesty concerning exams delivered through our online testing procedures. There was a single incident in which a proctor reported suspicious behavior, but upon further analysis the incident turned out to be nothing of concern.
  • Instructors' experience with prevention of academic dishonesty: Our respondents expressed that online testing/proctoring is one way to help prevent academic dishonesty: it lets us know that the person who took the exam is indeed the person for whom the exam was intended.
  • Instructors’ experience with time commitments: Our respondents did not hide the fact that there is quite a learning curve with online testing/proctoring. However, by the second or third exam, faculty reported that they were comfortable with the procedures.

As one instructor recalled:

The first exam we had was really rocky, we had some issues, there was a password that did not match up, and there was a timing issue and students didn't know what to expect and they were being delayed and I don't think ProctorU was telling them the reason, so it took them a half hour to 45 minutes to get it sorted out. So the second exam was better, and by the time we got to the third exam it went almost perfectly.

For students who had not previously experienced online proctoring, the idea of someone watching them through their computer might well produce anxiety, and our students' responses supported that assumption.

  • Instructors' overall experience with online testing/proctoring: In addition to positive aspects of using it, faculty reported some administrative issues. In one reported incident, an online proctor cut a student off after a set time period, even though the test was untimed. Another, more serious issue involved three occasions on which a proctor told a student of Muslim faith to take off her scarf, which she repeatedly refused to do. Our respondents stressed that the service should train its staff in a way that reflects our university's respect for diversity. However, they also noted that the service's staff has been responsive and accommodating, and when an issue for which they were responsible was flagged, they were quick to work with the students to find a resolution.

Discussion

The results of our pre-survey of student expectations and concerns revealed several important issues that were prevalent in the three courses we studied. First, students did not express great concern about taking their exams online. However, they did express concern about whether their own equipment would be adequate to meet the online testing service's technical requirements. The requirements for an operative webcam and other computer features may seem obvious to instructors and technically savvy students, but they can appear complex to at least some students and lead to uncertainty on their part. Second, we found students' lack of desire for scheduling flexibility somewhat surprising. This may be due to the alternatives students had, which gave them available times throughout the exam testing periods in addition to a choice of taking their exams in our on-campus testing center. It would likely have been a greater issue if, as in the typical situation, opportunities to take the exam were more restricted. Third, even though students' accounts of their previous experiences with proctored online exams did not indicate this issue, a large number of them expressed concern that the proctors would distract them during the exam. Fourth, it appeared that whereas students did not actually expect to need the help of proctors during their exams, they attached importance to having them available. The difficulty might be the unusual situation students encounter when, instead of their instructor or at least a teaching assistant being present during the exam, there is a disembodied presence monitoring them. And because the proctor is someone other than a member of the teaching staff, students may fear they will be unable to get help if they need it.

The post-survey showed a number of relevant findings. First, a significant percentage of students expressed dissatisfaction with delays they experienced in actually starting their exams and indicated they suffered stress on that account. We suspect that some of this problem may be due to students making reservations on short notice prior to their exam times, which would make it difficult for the proctoring service to schedule sufficient proctors to meet demand.

Second, nearly half the students said they would have had problems coming to campus to take a traditional on-campus exam. Most of the students enrolled in the classes in this study lived near or on campus, but some clearly resided elsewhere, making the online service necessary for them.

Third, students expressed general satisfaction with the flexibility provided by online testing for scheduling their exams. Even though they initially said it would not have been a problem, they apparently thought differently in retrospect.

Fourth, echoing a concern expressed in the pre-survey, many students said that their biggest dislike of the online exam was that the proctors were distracting. Again, this is an issue that instructors could address early on, during orientation or instruction on this type of exam format.

Fifth, despite the concerns anticipated on the pre-survey, students actually asked few questions of the proctors, suggesting they did not see that ability as a necessity. Most instructors strive to create exams that are clear to students and do not require explanation. We believe this was the case for the courses in this study, and students ultimately appreciated it.

Finally, there were decidedly mixed results as to whether students would choose the online testing service over an on-campus exam in their future courses. It appears that students would choose the online service primarily if coming to campus was not a good option for them.
Our student questionnaire data indicated that online education may not be primarily about distance education; rather, it provides learners with a convenience that fits their work and other obligations, without which it would be very difficult for them to take our courses or complete their programs. Our instructors' views were fairly consistent with the findings from the student surveys. Instructors repeatedly stressed that there are situations where it is really helpful to have online options for students. Ultimately, it comes down to what the University has in mind in offering online courses: if the objective is to broaden the availability of our courses to meet the daily life circumstances of our students, then online proctoring is completely necessary for those students. Having undertaken this study, we argue that fully online courses require online proctoring.

Our faculty focus group revealed few problems experienced by, or greatly concerning to, instructors. It was clear that a learning process is necessary for faculty to develop procedures and expectations related to using online testing services. The instructors of the courses in this study were generally positive about the online testing service and were unanimous in their intention to continue using it. This is not to say there were no problems, but they turned out to be manageable, and the instructors generally agreed that the benefits certainly outweighed them.

Our findings suggest several things that instructors should consider when deciding to use an online testing and proctoring service. First, we suggest that instructors thoroughly familiarize themselves with how the services work so they can anticipate students' concerns.

Second, instructors should identify students’ technical difficulties and try to address them by spending time familiarizing students with how to get ready for and ultimately take their exams.

Third, instructors need to anticipate what scheduling issues students might have. At our university, traditional in-class exams are managed by a central office so that students know when their exams are scheduled and know that they do not conflict with their other classes’ exams. Freeing up the process with an online testing service may seem to make things easier but it’s possible that too many choices may complicate matters.

Fourth, depending on the online service they choose, instructors need to anticipate students’ fear of proctor intrusiveness and develop ways to address it. We found this concern on both our pre- and post-surveys. Different online services now offer options ranging from only live proctors, to only recorded sessions with no live proctor contact, to a mix of both. Not having a live proctor might eliminate the worry of some students that someone is spying on them, but might also make them feel abandoned if by chance they need help with the technology or clarification of an exam item. This issue is closely related to the delays in starting their exams, which some students expressed. Online exam vendors working with live proctors need to schedule an adequate number of proctors, and at very busy times wait lengths are likely to increase. Services that simply record may not have these problems. We recommend that instructors consider these issues and how they can make their exam instructions and questions as “transparent” as possible so that students don’t feel left on their own.

Fifth, instructors should endeavor to determine, early on, where students are taking the course from. Knowing how many are restricted from coming to campus (if given a choice between campus and online exams) helps with planning and avoiding problems. Determining other motives students might have for taking either online or on-campus exams may prove valuable as well.

Limitations and Recommendations for Future Studies

One limitation of this study has to do with the focus group discussion analysis. The focus group findings reported here are the results of a first-phase data analysis (i.e., visual data analysis: isolation of themes and sentences). We did not deploy qualitative analysis software to transcribe and analyze our focus group data, and we consider this a limitation that should be addressed in future research.

To reiterate a point made above, many of our respondents reported a significant level of discomfort with online proctoring. Therefore, we think future studies in this area could examine antecedents such as anxiety induced by remote online proctoring and its effect on students' exam performance. One approach might be studies that compare face-to-face and online proctoring environments from a test anxiety perspective, as well as its implications for students' exam performance. In addition, whether online proctoring was a student's first choice of test-taking format seems to have a significant impact on how they react to the testing environment; this warrants further consideration and research. As online courses increase in number, online exams seem likely to increase as well; therefore, it will be important to make the process of taking them go as smoothly as possible for both students and their instructors.

References

Adkins, S. (2011). The US Market for Self-Paced eLearning Products and Services: 2010-2015 Forecast and Analysis. Ambient Insight.

Allen, I. E., & Seaman, J. (2015). Grade Level: Tracking Online Education in the United States. Babson Survey Research Group.

Allen, I. E., & Seaman, J. (2011). Going the Distance: Online Education in the United States, 2011. Babson Survey Research Group and Sloan Consortium.

Allen, I. E., & Seaman, J. (2010). Class Differences: Online Education in the United States, 2010. Needham, MA: Babson Survey Research Group.

Barnes, C., & Paris, B. L. (2013). An Analysis of Academic Integrity Techniques Used in Online Courses at a Southern University. Northwest Decision Sciences Institute Annual Meeting Proceedings.

Bowers, W. J. (1964). Student Dishonesty and its Control in College. New York: Bureau of Applied Social Research, Columbia University.

Carey, K. (2015). The End of College: Creating the Future of Learning and the University of Everywhere. New York: Riverhead Books.

Carstairs, J., & Myors, B. (2009). Internet Testing: A Natural Experiment Reveals Test Score Inflation on a High-Stakes, Unproctored Cognitive Test. Computers in Human Behavior, 25(3), 738-742.

Casey, D. (2008). A Journey to Legitimacy: The Historical Development of Distance Education Through Technology. TechTrends, 52(2), 45-51.

DeMillo, R. A. (2011). Abelard to Apple: The Fate of American Colleges and Universities. MIT Press.

Ercegovac, Z., & Richardson, J. V. (2004). Academic Dishonesty, Plagiarism Included, in the Digital Age: A Literature Review. College & Research Libraries, 65(4), 301-318.

Gabriel, T. (2010). To Stop Cheats, Colleges Learn Their Trickery. New York Times, 5.

Grijalva, T. C., Nowell, C., & Kerkvliet, J. (2006). Academic Honesty and Online Courses. College Student Journal, 40(1), 180-185.

Harmon, O. R., & Lambrinos, J. (2008). Are Online Exams an Invitation to Cheat?. The Journal of Economic Education, 39(2), 116-125.

Hart, L., & Morgan, L. (2010). Academic Integrity in an Online Registered Nurse to Baccalaureate in Nursing Program. The Journal of Continuing Education in Nursing, 41(11), 498-505.

Harden, N. (2013). The end of the University as We Know it. The American Interest, 8(3), 54-62.

Kinney, N. E. (2001). A Guide to Design and Testing in Online Psychology Courses. Psychology Learning & Teaching, 1(1), 16-20.

Mann, J. T., & Henneberry, S. R. (2012). What Characteristics of College Students
Influence Their Decisions to Select Online Courses?. Online Journal of Distance Learning Administration, 15(4), n4.

McCabe, D. L. (2005). Cheating Among College and University Students: A North American Perspective. International Journal for Educational Integrity, 1(1).

McCabe, D. L., Treviño, L. K., & Butterfield, K. D. (2001). Cheating in Academic Institutions: A Decade of Research. Ethics & Behavior, 11(3), 219-232.

Naglieri, J. A., Drasgow, F., Schmit, M., Handler, L., Prifitera, A., Margolis, A., & Velasquez, R. (2004). Psychological Testing on the Internet: New Problems, Old Issues. American Psychologist, 59(3), 150.

Reynolds, D. H., & Weiner, J. A. (2009). Online Recruiting and Selection: Innovations in Talent Acquisition. John Wiley & Sons.

Schultz, M. C., Schultz, J. T., & Gallogly, J. (2007). The Management of Testing in Distance Learning Environments. Journal of College Teaching & Learning, 4(9), 19-26.

Shaw, D. C. (2004). Academic Dishonesty in Traditional and Online Courses as Self-Reported by Students in Online Courses.

Sterngold, A. (2004). Confronting Plagiarism: How Conventional Teaching Invites Cyber-Cheating. Change: The Magazine of Higher Learning, 36(3), 16-21.

Stuber-McEwen, D., Wiseley, P., & Hoggatt, S. (2009). Point, Click, and Cheat: Frequency and Type of Academic Dishonesty in the Virtual Classroom. Online Journal of Distance Learning Administration, 12(3), 1-10.

Tao, J., & Li, Z. (2012). A Case Study on Computerized Take-Home Testing: Benefits and Pitfalls. International Journal of Technology in Teaching and Learning, 8(1), 33-43.

Communications regarding this chapter should be sent to Daniel Woldeab, College of Individualized Studies, Metropolitan State University, daniel.woldeab@metrostate.edu

License


Innovative Learning and Teaching: Experiments Across the Disciplines Copyright © 2017 by Individual authors is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.