Those who create sexually explicit ‘deepfakes’ will face prosecution under a new law, the Ministry of Justice has proposed.
Minister for Victims and Safeguarding at the Ministry of Justice, Laura Farris, said: “The creation of deepfake sexual images is despicable and completely unacceptable irrespective of whether the image is shared. It is another example of ways in which certain people seek to degrade and dehumanise others – especially women. And it has the capacity to cause catastrophic consequences if the material is shared more widely. This government will not tolerate it. This new offence sends a crystal clear message that making this material is immoral, often misogynistic, and a crime.”
As part of the Criminal Justice Bill, which is passing through Parliament, the Government is also creating new criminal offences against those who take or record intimate images without consent – or install equipment to enable someone to do so. The Online Safety Act last year criminalised the sharing of ‘deepfake’ intimate images.
Comment
Jake Moore, Global Cybersecurity Advisor at the cyber firm ESET, said: “Although criminalising the creation of deepfake pornography might not completely eradicate the problem, it serves as a strong deterrent – particularly to younger people – by highlighting the extreme harm it causes to those involved as well as the repercussions for the creators.
“Major corporations that provide deepfake technology via off-the-shelf products are aware of the problem and have implemented strategies to eliminate the production of such pornographic content. However, AI software can be kept underground and developed by anyone who has enough time, money and training data.
“Deepfake pornography can have a traumatic effect on victims and as technology improves, it worryingly becomes easier for someone to create any footage of anyone they want. In fact, 50% of Brits are worried about becoming a victim of deepfake pornography, and one in ten (9%) reported either being a victim of it, knowing a victim, or both.”