
CASEWORK NOTE: COMPLAINTS RELATING TO AI AND ACADEMIC MISCONDUCT

Generative Artificial Intelligence (AI) is an advancing technology and conversations about its place in higher education are nuanced and fast-moving. There is a growing public expectation that higher education students will have opportunities to develop AI skills in preparation for future employment, but there is less consensus about how AI should be used in student assessment.

We have published some case summaries relating to use of AI. Almost all of the complaints we have received about the use of AI have been from students who have been subject to an academic misconduct procedure. The number of these complaints remains low. From our conversations with student representative bodies (SRBs) and higher education (HE) providers we understand that the incidence of academic misconduct linked to AI is rising within providers’ internal procedures. We know that many providers’ procedures allow for an educative rather than punitive approach for minor or first instances of academic misconduct and we consider this to be good practice. It may be that students are less likely to pursue a complaint to us in circumstances where an academic misconduct process has not resulted in a heavy penalty or has provided an opportunity for them to develop their understanding of what is permitted.

The data we have about student complaints cannot show whether some students’ use of generative AI in assessment is going undetected by providers. It is important that providers continue to engage their students in discussions about what AI should be used for, and what is not acceptable. HE providers have a responsibility to ensure their assessments are fair, valid, and reliable. They also have a responsibility to investigate fairly any concerns that could affect the academic standards and integrity of their qualifications.

The principles set out in the Good Practice Framework: disciplinary procedures apply to all types of academic misconduct, including inappropriate use of generative AI in assessed work. Some providers have developed specific AI policies, while others incorporate AI guidance into existing academic misconduct processes, assessment briefs and regulations. Either approach can be effective. While providers may describe an overall approach in their policies, it will often be necessary to set out more detailed guidance for individual disciplines, courses or modules. It is essential that in all cases providers make it clear to students what is and what is not acceptable use of AI in learning and assessment, and what is considered to be academic misconduct.

Notifying students

When a student is thought to have committed academic misconduct using AI, they should be told in writing what they are alleged to have done that breaches the assessment rules, and why the provider believes this. They should also be given sufficient notice of meetings and be provided with all relevant evidence, including detection software reports, to allow them to respond effectively to allegations. Some students may need additional support to navigate the process and should be signposted to any available support, for example through a students' union adviser or equivalent.

Investigating alleged breaches

The responsibility is on the provider to prove that the student has done what they are accused of doing, not on the student to disprove it. It is good practice for providers to consider whether the whole or part of a submission is thought to be AI-generated, and clearly set out what aspect of the assessed work led to the suspicion that AI had been used inappropriately.

Providers should consider a range of evidence. It is important that decision-makers understand the strengths and limitations of detection software, and weigh this evidence carefully against other available information.

Providers may decide to carry out a viva or use another mechanism to test the student’s understanding of the work that was submitted. It is important that students understand that the focus of the viva will be on the content of the assessment. It will usually be appropriate to take account of how long ago the assessed work was completed when evaluating the student’s knowledge in this kind of viva.

As well as exploring the student’s knowledge of the topic, providers should give students the opportunity to explain how they worked to prepare their submission. It can be helpful to explicitly ask students to supply copies of any notes or drafts. If providers are using the metadata associated with electronic files as evidence, it is good practice to explain what they think the metadata shows and give students an opportunity to respond.

Providers may compare the work under scrutiny to other assessed work previously completed by the student. It is important to be transparent about which pieces are being considered so that students can comment on whether and how their approach to the earlier work was different.

Language and disability considerations

Providers should consider whether assumptions about AI use could be biased against a student’s writing style, for example if the student is disabled or has a communication difference, or if English is not the student’s first language. It may also be relevant to consider whether these factors could have affected the student’s understanding of what AI use was permitted.

Decision making and record keeping

Decision makers should give clear reasons for their findings. Where the provider decides, on the balance of probabilities, that academic misconduct has taken place, it should explain its choice of penalty, including why any lesser penalties were not considered suitable. Providers should keep reliable and accurate records of all meetings and proceedings to ensure that processes are transparent and that appeals can be made effectively.

