The International Center For Academic Integrity Conference 2021 took place virtually for the first time. The virtual format allowed more than 1,000 attendees to join the event, with a variety of parallel sessions on offer. You can check my previous blog posts for my reports on the physical ICAI conference 2020 and ICAI conference 2019.
I gave a practitioner presentation on academic integrity teaching, co-presented at a workshop on publishing academic integrity research (abbreviated video version available here) and attended lots of interesting and topical sessions. This blog post is based solely on the sessions I was able to attend live. There are lots more sessions I’d still like to dip into. One benefit of the virtual format is that all of the presentations are archived for attendees in video format.
The discussion at this conference was also excellent. Some comments are archived on Twitter with the hashtag #ICAI2021, but a lot of discussion also took place on Whova (the app used to run the conference) and in the text chat on Zoom. One thing that stood out for me is that there were such a lot of new attendees grappling with contract cheating and use of file sharing sites for the first time. We need to find more ways to get actionable information about academic integrity out to that audience.
This isn’t the first report about the conference. The wonderful Debora Weber-Wulff was on the ball as usual and had her thoughts up within seconds of the final conference presentation concluding. As circumstances would have it, Debora and I attended many of the same sessions. Then, just as I was finishing writing this post, I saw that the equally wonderful Sarah Eaton had posted her own thoughts about the conference collaborations she was involved with.
As is traditional, I wanted to collect together some of my main thoughts post-conference and reflect on the findings. I also wanted to look at the academic integrity challenges and opportunities we should be addressing as we move forward. I’ve identified five main themes, somewhat interconnected, which really stood out to me.
#1 – Academic File Sharing Sites Pose A Risk To Academic Integrity
The single most dominant theme discussed at the conference was the way students have been misusing sites like Chegg to get answers to exams and coursework questions produced for them. This isn’t a new behaviour and indeed it could be considered a variant of contract cheating, but it seemed like many delegates had not seen discussions of contract cheating before. Much of the increase was blamed on the move to online exams in light of Covid-19, and my own research with Codrin Cotarlan has seen an increase of 196.25% in homework requests following the move online, but the real situation is always more complex.
Most of the conference discussion focused on Chegg, but this is not the only site that allows students to share university owned files online and discuss answers. Other sites operating in the same space, like Course Hero, were also covered. The impression given in the online discussions is that delegates saw Chegg as the most visible site of this type, but Chegg did also offer mechanisms to help academic integrity investigations, even if students were finding ways to circumvent these.
One of the biggest concerns educators tend to have about Chegg is how quickly answers can be returned during an examination, with one delegate expressing their frustration with questions being visible within 10 minutes of the start of their exam. Tricia Bertram Gallant and Marilyn Derby analysed the time it took to receive answers more thoroughly in relation to requests for UC Davis courses.
Tricia and Marilyn shared examples where they showed that hundreds of UC Davis students had used Chegg to cheat, including 32% of students on a statistics module. In a separate example, Kelly Ahuna and Loretta Frankovitch from University at Buffalo said that they had investigated over 100 academic integrity violations on Chegg since remote teaching began, each of which could involve multiple students. Those investigations had provided them with enough information to identify about 70 students in total. The concerns about Chegg were echoed by others at the conference, with several hundred cases known about at one university. An example of a 17 page takedown notice being necessary to address the requirements for a file sharing site to take action was given.
Example of the type of information supplied by Chegg for an academic integrity violation. This example, courtesy of University at Buffalo, shows two students posting multiple questions for the same exam.
There was general consensus that it was good that some students using file sharing could be identified, but delegates also noted how easy it was for students to hide their identity if they chose to do so. Examples were given of how students could access Chegg through Discord bots, discussions noted that they could use fake email accounts and VPNs, and examples were shown where the questions were posted through accounts where the student claimed to be at a different university. I’ve also observed a lively trade online of buying, selling and leasing out Chegg accounts, sometimes legitimate, sometimes stolen, often at a heavy discount, but all making direct identification really difficult.
An associated question asked what the penalties should be for students accessing file sharing sites. Here, university approaches differed. Some universities did not allow students to post questions on third party sites at all, others did not allow them to look up answers. There was also a question raised about what happens if a student looks up an answer after an exam has finished. In many cases, posting university owned materials online was said to be a breach of copyright and intellectual property. One amusing story was told about a student trying to negotiate the penalty they were awarded for using Chegg. The student had posted three questions, but in their defence said they had only used one of the answers.
Zachary Dixon warned that many university courses are compromised based on the volume of their intellectual property found on file sharing sites
Does the appearance of questions and answers on file sharing sites pose a real risk? Zachary Dixon said that it does and indicated that universities need to be more aware of how common unauthorised file sharing is. Zachary proposed a “compromise metric”, which measures how much of the content of any given course is available online. The compromise metric also considers how valuable those items are, with quizzes, tests and exams rated as more valuable than homework-style questions. Zachary found some highly compromised courses within his own university, but a wider study across other universities would be useful to see how far this problem generalises.
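The exact formula behind the compromise metric wasn’t published at the conference, but the idea can be sketched as a value-weighted fraction of course content found online. The weights and item types below are my own illustrative assumptions, not Zachary’s actual parameters:

```python
# Hypothetical sketch of a "compromise metric". The weighting scheme here
# is an assumption: assessed items count for more than homework.
ITEM_WEIGHTS = {"exam": 3.0, "test": 3.0, "quiz": 2.0, "homework": 1.0}

def compromise_metric(course_items, found_online):
    """course_items: list of (item_id, item_type) pairs for one course.
    found_online: set of item_ids discovered on file sharing sites.
    Returns the value-weighted fraction of compromised content, in [0, 1]."""
    total = sum(ITEM_WEIGHTS[t] for _, t in course_items)
    compromised = sum(ITEM_WEIGHTS[t] for i, t in course_items if i in found_online)
    return compromised / total if total else 0.0

items = [("q1", "quiz"), ("hw1", "homework"), ("final", "exam")]
print(compromise_metric(items, {"final"}))  # 3.0 / 6.0 = 0.5
```

Under this sketch, a course whose final exam has leaked scores as 50% compromised even though only one of three items is online, reflecting the higher value of exam content.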
There is no easy solution to file sharing sites, but it is important for the educational community to know that these sites exist, that students use them, and that they market heavily to students. One delegate stated that Chegg offered $1200 to their Maths club if they could come in and give a pitch. That story may be anecdotal, but it seems to illustrate the value file sharing sites see in acquiring customers and getting access to more academic content through them.
#2 – The Methods Available To Detect Academic Dishonesty Are Improving
Although detection is not in itself the saviour of academic integrity, it is one tool that the community should have in its toolbox to help with integrity. The conference saw the discussion of several methods that may help here.
The need to automatically crawl and monitor file sharing sites for university owned content was discussed, with several tools developed for use at individual institutions, with the possibility for wider rollout in the future. These tools would not only provide an alert about academic integrity breaches, but also occupy time for the file sharing sites, requiring them to deal with takedown requests and to release information about the users accessing these materials.
Sometimes, just having copies of the answers given by file sharing sites can be enough. It is then just a matter of matching those answers to student submissions. The suggestion was made for universities to simply pay for the answers, but Sarah Eaton does not think we should be financially fuelling that industry. I would be inclined to agree with Sarah here.
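The matching step itself can be straightforward. As a minimal sketch (the threshold and data here are illustrative, and real cases would need human review), a simple sequence-similarity comparison from Python’s standard library goes a long way:

```python
import difflib

def flag_matches(file_share_answers, submissions, threshold=0.8):
    """Flag submissions whose text closely matches an answer retrieved
    from a file sharing site. The 0.8 threshold is an illustrative choice;
    flagged cases would still need human review."""
    flagged = []
    for student, text in submissions.items():
        for answer in file_share_answers:
            ratio = difflib.SequenceMatcher(None, answer.lower(), text.lower()).ratio()
            if ratio >= threshold:
                flagged.append((student, round(ratio, 2)))
                break
    return flagged

answers = ["The mean is 4.2 because the sum of the values is 21 over 5 items."]
subs = {
    "student_a": "The mean is 4.2 because the sum of the values is 21 over 5 items.",
    "student_b": "I calculated the median using a completely different approach.",
}
print(flag_matches(answers, subs))  # [('student_a', 1.0)]
```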
The approach of seeding wrong answers online to exam questions was mooted. Such answers make it possible to track students who access them. One of the proctoring companies was said to offer a service where they would rank wrong answers in the first pages of Google results. Other delegates talked about doing this manually, or having answers prepared to post on file sharing and contract cheating sites that allow user submitted content as soon as exam questions appear there. The wider question does have to be asked whether such behaviour qualifies as entrapment. My view is that we should ourselves be leading with academic integrity, but I can see why some people would want to fight fire with fire.
We’ve begun to see some interesting work looking at contract cheating detection (as I have previously outlined in this blog post). Olumide Popoola has developed an alternative stylometric approach for identifying contract cheating, which shows a lot of promise. Olumide has also identified methods of differentiating paid for essays from those written by students. More information is outlined here in Olumide’s blog post.
Olumide collected together a corpus of over 1,000 business essays. Some of these were the free samples provided by essay mills. Others were real work submitted by students. Olumide then used stylometric techniques to identify differences between the contract cheated essays and the student essays.
The results Olumide found match up well with other checklists of how to identify work produced through a contract cheating provider, which is positive to see. One of Olumide’s hints is to look for the use of extra words in student essays. These unnecessary additions are designed to just push the word count of essays up to meet minimum requirements (and to ensure that writers for contract cheating firms get paid quicker).
Olumide hopes to develop this work to produce a standalone tool, which will be very useful for the community. He also wants to check how well the results collected from business essays expand to other fields.
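To make the idea of stylometric features concrete, here is a minimal sketch of the kind of per-essay measurements such an approach might compute. The specific features and filler words below are my own illustrative choices, not Olumide’s published feature set:

```python
import re

# Illustrative filler words; the features that actually discriminate
# contract cheated work in the published research may differ.
FILLERS = {"very", "really", "basically", "actually", "just", "quite"}

def stylometric_features(text):
    """Compute a few simple stylometric features for one essay."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "filler_rate": sum(w in FILLERS for w in words) / max(len(words), 1),
        "avg_word_len": sum(map(len, words)) / max(len(words), 1),
    }

print(stylometric_features("This is really very simple. It basically works."))
```

Features like these could then feed a classifier trained on the labelled corpus; an unusually high filler rate would align with the “extra words to push the word count up” signal described above.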
#3 – We Haven’t Got Remote Examinations Right… Yet…
This is the online proctoring system used to preserve academic integrity at the University of New England
With Covid-19 interventions being such a dominant theme, the conference hosted plenty of discussion about remote examinations and the steps institutions are taking to uphold academic integrity, particularly with students able to outsource the questions to file sharing sites like Chegg.
The issue of online proctoring raised its head, with the need to balance security and privacy.
Jennifer Lawrence and Kylie Day from the University of New England shared the perspective of a university that had developed exams to be online from well before the pandemic. They noted the extra opportunities provided to students when they could take exams remotely, making education accessible to students who could not easily attend a university setting in person.
Jennifer and Kylie shared an example of what remote examination at their institution looks like, from both the test taker and invigilator point of view. The invigilators work in an office environment, each monitoring around six students. The students work in their home (or other preferred) environment and are monitored through their cameras.
Yes, the situation could be considered invasive, but Jennifer and Kylie’s view is that students are also being closely monitored in an exam hall. And these students knew what they were signing up for.
The situation is less clear-cut when changes made at other universities as a result of the pandemic are considered. In this case, not all the students knew they would be monitored at home when they signed up for their course. Those students may not even have access to a private environment. There are also issues when artificial intelligence based monitoring solutions are used, especially when these solutions may include elements of bias.
How seriously should we take assessment security? Jennifer and Kylie discussed the in-person example where students need to be searched before exams to check for hidden earpieces (something we found to be a known problem in research around South East Europe). I largely agree with Jennifer and Kylie here, but I would have to question how thoroughly students are searched before taking exams in many institutions. Another possible comparison would be with professional exams. With these, not only are candidates searched on entry, but they also take the exams in a closely monitored environment, with cameras recording all the way through. In some ways, the remote alternative does not sound anywhere near as invasive as the situation students will face post graduation.
The debate will continue, but perhaps exams are just not the best assessment method to use in a post Covid-19 world. We do need to be considering alternatives.
#4 – We Need To Rethink How We Teach And Assess Maths
Students now have access to tools like Mathway to quickly solve their maths problems
There wasn’t an official presentation on this topic, but the same issue was raised several times in discussions by different delegates.
The landscape for maths support and help has changed substantially over the past five years. It is now simple for students to download apps, take a photo of a maths question and get back a worked solution.
This problem of maths solvers isn’t brand new. I remember warning about WolframAlpha, which offers a similar service through a web interface, ten years or so ago. But what has changed is how well known apps and mobile friendly websites like Photomath and Mathway are, and how readily they appear to be used in examination situations.
Now, some people will dismiss this as just a problem for students on Mathematics degrees, but the consequences are much wider reaching. So many courses have a maths component, particularly in the early years of study. Other courses require maths proficiency throughout. As Alexander Amigud and I showed when we analysed the demand for contract cheating services, students from across many disciplines offer to pay for answers to maths problems. One interesting study would be to see how far those paid for maths answers are unique and how far service sellers simply resort to using maths apps for themselves.
The number of students cheating using maths apps is hard to quantify, but one delegate said 45% of their students used cheating maths apps for an online proctored trigonometry exam. They were able to position their phones outside of the view of the monitoring cameras.
A positive note is that detection of the use of these apps is possible in some situations. The apps do not always solve problems in the same way a human would, or follow the formats taught in class. And if many students hand in identical answers and working, as generated by the apps, that is suspicious. But apps improve all the time and academic integrity breaches can be hard to prove. Mathematics is definitely an area where a rethink of teaching and assessment strategies is needed.
#5 – Partnering With Students Offers Us A Way To Move The Academic Integrity Community Forward
Cath Ellis encouraged universities, quality bodies and individuals to share information and develop strategic partnerships around academic integrity
A very positive theme emerging from the conference was the range of ways staff-student partnerships in academic integrity can develop.
Staff-student partnership is a theme I’m passionate about. I’m working on a book chapter discussing partnership opportunities in more detail. My recent paper with Codrin Cotarlan came about as an Imperial College London StudentShapers partnership.
Two delegates shared examples of useful software they’d had produced by students.
Debora Weber-Wulff showcased a text comparison tool, which highlights similar text from two documents in a side-by-side colour coded way. Very useful when trying to demonstrate to students that just changing a few words here and there is still plagiarism, or when documenting a case for an academic misconduct hearing.
Zachary Dixon’s CourseVillain tool, designed to monitor and request the take down of unacceptable posts on Course Hero, also came about from hiring a Computer Science student for a summer programming project. Zachary encouraged other universities to look into ways to pair with Computer Science students as well.
Cath Ellis and Kane Murdoch spoke enthusiastically about the need to build academic integrity communities. In my own presentation I talked about finding different ways to engage students as our partners, and explained why I consider students to be well-suited to conducting academic integrity research. Several students co-presented at the conference and took part in panels. In spite of the emerging challenges to academic integrity we’ve seen, the student movement shows that the overall future of society is still in safe hands.