Video: How To Scam Friends And Misuse People
Watch our new video demonstrating the ease with which audio deepfakes can be made
Like and share our video on YouTube or Twitter to raise awareness
The video, produced and released by Control AI, shows multiple volunteers how easy it is to create deepfakes of them. For each participant, a short audio clip of their voice is uploaded to a publicly accessible AI voice generator website, which then produces fake audio of them making statements they never made. The fake statements are then used in phone calls to friends and family members asking for money and favours - and all of them believe the fake calls are real.
The participants in the video are visibly shocked and alarmed by the uncanny resemblance the audio deepfakes bear to their real voices. They are also unnerved by the knowledge that anyone with access to an audio recording of them could use the website in question to commit fraud or theft, or even to put them and others in danger.
Under current UK legislation, it is entirely legal to create deepfakes. Public opinion is overwhelmingly in favour of regulating deepfake technology: a YouGov poll commissioned by Control AI found that 86% of the British public want a ban on non-consensual deepfakes. The political salience of the issue is also rising steadily - especially given that 2024 will be the first election year in which deepfake technology is widely available.
These dangers won’t go away until we have policies that hold tech companies liable for malicious deepfakes created with their technology.