
Deepfaking emerges as “small but growing” trend in student interviews

Enroly, a leading platform that streamlines the onboarding and arrival process for universities, students, and agents, has reported growing instances of ‘deepfaking’ being used in credibility interviews.

The company has assured its partners that it remains ahead of the deepfake threat – manipulated content designed to replicate a person’s appearance or voice – and is continually adapting to counter any risk.

The company reported zero cases of deepfaking flagged in interviews during the UK’s September 2024 intake, but saw multiple attempts in January 2025.

Out of approximately 20,000 interviews in the January 2025 intake, 1.3% of cases involved “outright deception”, according to Phoebe O’Donnell, head of services at Enroly.

“Some of the things we’ve uncovered in student interviews are straight out of a sci-fi film – lip-syncing, impersonation, and even deepfake technology… Welcome to the future of fraud, folks,” wrote O’Donnell in a blogpost addressing the issue.


Enroly broke down the cases of deception in the January 2025 intake, noting that 0.1% of cases involved “third-party support”, 0.6% involved “lip-syncing” and 0.15% involved “deepfake attempts”.

“Deepfakes were the biggest surprise… They’re the stuff of nightmares for interview assessors – fake faces layered over real ones, complete with expressions and movements. It’s like something out of a spy film. And yes, they’re incredibly hard to detect,” said O’Donnell.

“But hard isn’t impossible. Thanks to real-time tech and a few clever tricks up our sleeves, we’ve already stopped several attempts. It’s a small but growing trend, and we’re determined to stay ahead of it and work with our partners and the wider sector,” she assured stakeholders.

“Fraudulent practices will keep evolving, but with the right combination of technology and expertise, universities can stay one step ahead,” O’Donnell added, announcing a webinar for institutions keen to learn more.

Speaking to The PIE News, a UK Home Office spokesperson highlighted the department’s measures to prevent such fraud in its own processes, and the penalties for those who are caught.

“We have stringent systems in place to identify and prevent fraudulent student visa applications,” the spokesperson said.


“Any individual attempting to cheat or use deception will not succeed and may face a ban from applying for UK visas for 10 years. We will also continue to take tough action against any unscrupulous companies and agents who are seeking to abuse, exploit or defraud international students,” they continued.

In 2023, The PIE reported on the threat of deepfakes to English test security. While key stakeholders described the risk as “minuscule”, citing the high technical barrier to entry, providers such as the Duolingo English Test were taking proactive measures to safeguard their assessments, appointing a lead engineer specialising in deepfake technology and leveraging both AI and human proctors to detect signs of external interference.

The post Deepfaking emerges as “small but growing” trend in student interviews appeared first on The PIE News.