Sexually explicit deepfakes: women are more likely to be exploited

by Mark Rowe

Abuse of artificial intelligence tools is producing sexually explicit material of real people, according to a survey for a cyber firm. The survey collected data from over 2,000 Britons and found that half were worried about becoming a victim of deepfake pornography, and nearly one in ten (9pc) reported being a victim, knowing a victim, or both.

The anti-virus and internet security product company ESET points to the rising problem of deepfake pornography, recently highlighted when explicit deepfakes of the US singer Taylor Swift were viewed millions of times. The firm reports a new form of image-based sexual abuse in the UK, noting that at least 60pc of all revenge pornography victims are women, according to the UK Council for Internet Safety. Under the recently passed Online Safety Act, creating or inciting the creation of deepfake pornography became a criminal offence. However, the survey suggests that this has done little to alleviate fears around the technology: most women (61pc) reported concern about becoming a victim, compared with less than half (45pc) of men.

Nearly two in five (39pc) of those surveyed by Censuswide believe that deepfake pornography is a significant risk when sending intimate content, yet about a third (34pc) of adults have still sent such images. Of those who have, the research suggests a majority (58pc) regret sharing them, whether they answered ‘Yes, I would never send an intimate photo or video again’ or ‘Yes, but I would send an intimate photo or video again’.

The percentage of people sending intimate images or videos drops to 12pc among the under-18s, perhaps because a majority (57pc) of teenagers surveyed are concerned about becoming a victim of deepfake pornography.

Despite soaring interest in deepfakes, people are still taking risks, the firm suggests: just under a third (31pc) admitted to sharing intimate images with their faces visible. The research found that the average age at which someone receives their first sexual image is 14.

Jake Moore, Global Cybersecurity Advisor at ESET, said: “These figures are deeply worrying as they show that people’s online habits haven’t adjusted to deepfakes yet. Digital images are nearly impossible to truly delete, and it is easier than ever to artificially generate pornography with somebody’s face on it. Women are disproportionately targeted more often by abusers looking to exploit intimate images, and the prevalence of deepfake technology removes the need for women to take the intimate images themselves. We’re urging the government to look beyond the Online Safety Act and address this critical risk to women’s safety and security.”

The survey also asked about experiences with sharing intimate images online. A third (33pc) of all women surveyed reported that explicit images they had shared were misused. Of these, a quarter (25pc) were threatened with the images being posted, and 28pc have had their photos posted publicly without permission. Women are reluctant to seek help, with just 28pc saying they would go to the police if someone misused their images online.

The survey found widespread misunderstanding of the law around sexting: 44pc of those surveyed mistakenly believed it is legal to incite or encourage someone to send sexual images if they themselves are under 18. WhatsApp (37pc) was the most used platform for sharing intimate images, with Snapchat (30pc) second, despite nine in ten (89pc) of people being aware that messages can be screenshotted.

Among the cyber firm’s pieces of advice: the risk of your likeness being used in deepfake pornography is directly linked to how easy it is to find images of your face. Setting social media accounts to private, and being careful about who you let follow you, is the best way to reduce the likelihood of being targeted, the firm suggests.
