The Security Threat of Voice Cloning

Advances in voice cloning have brought computer-generated audio to a level the BBC reports “is now said to be unnervingly exact” and that some experts believe may constitute a substantial security hazard. The AI-driven technology learns and adapts on its own, and it has evolved greatly over the past few years. The newest iterations of the software can reproduce not only a person’s accent, but also their timbre, pitch, pace, speech flow, and breathing. Moreover, the cloned voice can be manipulated to express a range of emotions, including anger, fear, happiness, love, or boredom.
