Steps to help combat fraud in which criminals use an AI-generated replica of a person's voice to deceive victims

The voicemail from your son is alarming. He has just been in a car accident and is highly stressed. He needs money urgently, although it is not clear why, and he gives you some bank details for a transfer.

You consider yourself wise to other scams, and have ignored texts claiming to be from him and asking for cash. But you can hear his voice and he is clearly in trouble.

However, the voicemail is the latest way in which criminals are using technology to defraud people. By taking a tiny snippet of real audio from a person – just three seconds is enough – they can "clone" the individual's voice using freely available AI technology. From there, they can make a recording of the synthesised voice saying exactly what they want.