Deepfakes and deception: how accountants can protect themselves and clients

Deepfake technology has moved from sci-fi to reality and it’s a growing threat to your business. As a trusted professional managing sensitive transactions and data, spotting and outsmarting these AI-driven scams is now part of protecting your firm and clients.

3 Dec, 2024


At a glance

  • AI-generated deepfakes exploit trust and outdated verification, targeting accountants directly.
  • Advanced authentication and zero-trust policies are crucial safeguards.
  • If targeted, act fast to cease contact, save evidence and report the incident.

Deepfakes – AI-generated video, audio or images that mimic real people – are a growing weapon in financial fraud. For accounting professionals, they represent a serious threat, targeting the very trust on which your work relies. In fact, a recent State of Information Security Report showed 34 per cent of UK finance firms have faced deepfake-related security breaches in the past year.

No longer limited to manipulating public figures such as Prime Minister Keir Starmer in viral videos, fraudsters are now using deepfake technology to impersonate executives and trick staff into authorising payments, accessing bank accounts or releasing sensitive information. Earlier this year, UK engineering firm Arup was duped into sending £20 million after an employee joined a video call with people he believed were his colleagues.

Recent years have seen rapid improvements in generative AI, making deepfake technology cheap, accessible and increasingly sophisticated. With minimal expertise, scammers can create highly convincing forgeries. Real-time deepfake technology takes this further, enabling live manipulation of audio and video during calls – imagine a “client” approving an urgent transfer of funds during a Zoom meeting.

Risks for accountants

Because you have access to sensitive client data and the authority to approve high-value transactions, fraudsters see you as a direct route to significant financial gain. Deepfake technology amplifies this risk by exploiting your reliance on face-to-face and voice-based communication, particularly in today’s remote working environment.

This exposure not only threatens your financial security but also risks finance professionals becoming unwitting participants in fraudulent activities.

Beyond direct fraud, deepfakes also pose significant reputational risks, particularly for firms or clients with a strong public profile. The primary misuse of deepfakes has, to date, been to influence public opinion by impersonating public figures, according to Google’s DeepMind division. Damaged professional reputations or manipulated client perceptions could lead to loss of trust and potential legal consequences.

How can you manage deepfake risks?

Jack Garnsey, Subject Matter Expert at VIPRE Security Group, says managing the risks posed by deepfakes requires accountants to combine awareness, detailed procedures and cutting-edge technology to safeguard themselves, their firms and their clients. 

Headshot of Jack Garnsey
Jack Garnsey, Subject Matter Expert, VIPRE Security Group

1. Strengthen internal processes

“Accountancy firms must adopt a multi-layered approach to security that combines vigilance with robust procedural safeguards…and have a good understanding of the tactics that criminals employ to gain access to sensitive information. They must understand their organisation’s security policies and controls, so that in the event of an atypical situation or a potential attack, they are intuitively watchful, can identify abnormal activity, and take the necessary safeguards immediately,” Garnsey says.

Firms should require secondary methods of verification for any significant transactions, particularly those conducted via video or audio. A follow-up phone call to a verified number or secure digital signatures can prevent deepfake-enabled fraud.
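The secondary-verification rule described above can be expressed as a simple policy check. The sketch below is purely illustrative – the threshold, field names and `may_authorise` function are assumptions for demonstration, not a real firm’s controls – but it shows the core idea: a request that arrives over video or audio, or above a value threshold, must be confirmed through an independent channel before it can proceed.

```python
# Hypothetical sketch of an out-of-band verification rule for payments.
# The threshold and all names here are illustrative assumptions.

from dataclasses import dataclass

CALLBACK_THRESHOLD_GBP = 10_000  # example value requiring extra checks

@dataclass
class PaymentRequest:
    amount_gbp: float
    requested_via: str        # e.g. "video_call", "email", "in_person"
    callback_confirmed: bool  # confirmed via a pre-verified phone number?
    digitally_signed: bool    # carries a secure digital signature?

def may_authorise(req: PaymentRequest) -> bool:
    """Requests made over video/audio, or above the threshold, need a
    second, independent channel before they can be authorised."""
    high_risk = (req.requested_via in {"video_call", "audio_call"}
                 or req.amount_gbp >= CALLBACK_THRESHOLD_GBP)
    if not high_risk:
        return True
    # Secondary verification: callback to a known number OR a signature.
    return req.callback_confirmed or req.digitally_signed

# A £20,000 transfer requested on a video call with no callback is blocked.
print(may_authorise(PaymentRequest(20_000, "video_call", False, False)))
```

The design point is that the second channel must be independent of the first: a deepfaked video call cannot also answer the callback made to a number verified in advance.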

2. Embrace advanced authentication technologies

Dominic Forrest, CTO at iProov, agrees that firms’ increasing reliance on digital tools and online communication makes strong identity verification essential, while making deepfake fraud difficult to regulate.

“The emergence of readily available generative AI tools to create highly convincing deepfakes presents a new and significant threat. Firms should embrace advanced authentication technology that incorporates biometrics with liveness detection. This ensures that the client is genuinely present and who they claim to be,” Forrest says.

Headshot of Dom Forrest
Dominic Forrest, CTO, iProov

These measures are particularly critical at high-risk touchpoints such as onboarding, account recovery and digital document signing. Encryption tools and proactive risk management across these processes should be non-negotiable.

3. Adopt a zero-trust approach

Garnsey emphasises the importance of zero-trust principles.

“Strictly adhering to policies and processes is essential, especially those related to passwords and authentication,” he says. “Strong passwords, banned re-use of passwords, multifactor authentication and so forth are all key.

“Perhaps most critical is the application of ‘separation of duties’, so that no single individual has control over systems, processes, financial decisions and so forth.”
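Separation of duties lends itself to a simple mechanical check. The minimal sketch below is an assumption-laden illustration – `approve_payment` and the record fields are hypothetical, not any real system’s API – but it captures the rule: the person who initiates a payment can never be the one who approves it.

```python
# Illustrative separation-of-duties check. Function and field names are
# hypothetical examples, not a real accounting system's API.

def approve_payment(payment: dict, approver: str) -> dict:
    """Approve a payment only if the approver is a different person
    from the initiator; otherwise refuse outright."""
    if approver == payment["initiated_by"]:
        raise PermissionError(
            "Separation of duties: initiator cannot approve their own payment"
        )
    return {**payment, "approved_by": approver}

payment = {"id": "INV-042", "amount_gbp": 15_000, "initiated_by": "alice"}
approved = approve_payment(payment, "bob")  # fine: two different people
# approve_payment(payment, "alice")         # would raise PermissionError
```

Enforcing the rule in software, rather than by convention, means a single deepfaked individual cannot push a payment through end to end.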

Responding to a deepfake scam

If you suspect a deepfake scam, experts agree hiding the breach isn’t an option. Garnsey advises it’s critical to act decisively to limit the impact. “Stop all contact and communication immediately,” he says. Further engagement risks making things worse.

“Save as much evidence as you can – dates, times and the technology platforms used. This will help with subsequent investigations into the incident.

“Report and escalate the incident, following the established reporting procedure. This will ensure that all the appropriate teams are notified, including team members, cybersecurity experts and legal counsel.”

There’s no doubt deepfakes are rewriting the rules of trust in the digital age – protecting yourself and your client relationships means sharpening your defences before it’s too late.


Visit the IFA’s CPD programme, including Future Proofing your Practice.
