Stop Deepfake Scams Before They Stop You
Artificial intelligence has unlocked powerful opportunities across Africa, but it has also armed cybercriminals with a dangerous new weapon: deepfakes. By cloning voices and faces, fraudsters are now impersonating executives, employees, and even family members with chilling accuracy.
The African Reality
In South Africa, incidents of deepfake-linked fraud spiked more than twelve-fold in 2023, making it one of the fastest-growing cyber threats in the region (TransUnion). Nigeria, Kenya, and other East African markets have also reported sharp increases, particularly as digital banking and mobile payments expand.
Banks are sounding the alarm. FNB recently warned customers about AI-generated voice and video scams that impersonate bank staff or even family members, tricking victims into transferring money (MyBroadband).
In Kenya, cybercriminals are already using AI-forged voices and identity documents to try to bypass biometric security checks — part of a broader surge in which generative AI was behind more than a third of all new biometric fraud cases in Africa in 2024 (Microsoft Source EMEA).
For businesses, the risk is clear: fraudsters are no longer just sending suspicious emails. They can now appear on a video call as your CFO or sound like your CEO on the phone. And in regions where mobile adoption is high and remote communication is the norm, this threat is especially dangerous.
Why Deepfakes Work
Deepfakes succeed because they exploit trust. Employees are taught to act quickly when executives give instructions, especially if there is urgency. A cloned voice saying, “Approve this payment immediately, we’re closing a deal,” can override natural scepticism. Unlike traditional scams, these impersonations don’t always come with poor spelling or strange email addresses. They feel real, human, and urgent, which is why even experienced staff can be caught off guard.
Simple Ways to Verify Calls
- Pause and verify: If a call, video, or voice note feels urgent, slow down. End the call and reconnect using an official number or channel you already trust.
- Use internal pass-phrases: Establish a confidential code word that executives and teams must exchange before sensitive instructions are acted on.
- Require dual approval: For payments above a certain amount, enforce a policy that at least two people must validate the request.
- Spot the glitches: Deepfake videos may struggle with blinking, lighting, or lip movements. Don’t ignore visual “tells.”
- Educate your team: Regular awareness sessions are vital. Staff need to know that hearing or seeing someone is no longer proof of identity.
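For teams that automate payment workflows, the dual-approval rule above can be enforced in software rather than left to memory. The sketch below is purely illustrative — the threshold amount, field names, and approver identifiers are assumptions, not anything prescribed in this article or by any particular bank:

```python
from dataclasses import dataclass, field

# Hypothetical threshold: payments at or above this amount need two approvers.
DUAL_APPROVAL_THRESHOLD = 100_000


@dataclass
class PaymentRequest:
    amount: float
    beneficiary: str
    approvals: set = field(default_factory=set)

    def approve(self, approver_id: str) -> None:
        # A set ignores duplicates, so one person approving twice
        # still counts as a single approval.
        self.approvals.add(approver_id)

    def is_authorised(self) -> bool:
        needed = 2 if self.amount >= DUAL_APPROVAL_THRESHOLD else 1
        return len(self.approvals) >= needed


# Example: a large transfer is blocked until a second, distinct person signs off.
req = PaymentRequest(amount=250_000, beneficiary="Vendor X")
req.approve("cfo")
print(req.is_authorised())   # one approval is not enough at this amount
req.approve("cfo")           # repeat approval by the same person changes nothing
print(req.is_authorised())
req.approve("controller")
print(req.is_authorised())   # two distinct approvers: authorised
```

The point of the design is that no single cloned voice — however convincing — can move money alone; a second human, reached through a trusted channel, must independently confirm.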
Deepfakes are here, but they don’t have to succeed.
