I’ve been quite vocal about where deepfakes cause the most harm. Back in 2019, we looked at malign interference campaigns, and I took the line that, outside of revenge porn, this was where deepfakes were likely to have the most influence. Although people keep predicting major election interference, nothing of significance has ever happened; indeed, election fakes tend to be of poor quality.
Meanwhile, in smaller-scale but significantly more personal cases, horrible fakes of teenagers were the order of the day. When you make fakery easily available to all via DIY mobile apps, the results are inevitable: people are going to be awful to one another. Deepfake shenanigans are primarily about mass-producing harmful fake porn of individuals without consent.
On that subject specifically, there’s news of yet another site offering easy DIY deepfake porn.
The beginnings of a deepfake empire?
The unnamed site in question uses AI to generate nude images of women. Past sites along these lines have tended to operate in isolation. This time, the site is using “partner agreements” and referral systems to spawn look-alike services: if one site goes down, others are ready and waiting to take its place.
Researchers claim the images are “hyper realistic”, and that the site can generate nude or pornographic imagery even when the submitted photo shows fully clothed individuals. Site operators say they’re building a decentralised model to ward off the threat of takedowns while raking in the cash. Wired reports up to 50 million visits between January and October of 2021, with one day alone apparently seeing hundreds of thousands of image uploads run through the fakery tool. These are big numbers, with big money implications.
When action was taken against the main site, with payment accounts suspended and hosting removed, visitor numbers fell; this seems to have kickstarted the partner program drive. Wired states that a spin-off site operator claims to pay about $500 to the main site in return for the ability to generate up to 10,000 naked edits.
With the traffic numbers these sites are doing, many would view $500 as a small outlay for so many fakes. The spin-off sites let visitors generate a few free images, then funnel them down the payment route. It’s a guaranteed money spinner, and DIY fake sites aren’t exactly difficult to find online. As sites and creators promote their content on social media, it’s becoming increasingly easy to find dubious services along these lines and make use of them.
Where does the deepfake harm lie?
The majority of non-consensual deepfake imagery targets women, and it always has. For every vaguely humorous fake of Tom Cruise being Tom Cruise, significantly more women are placed into content they want no part of. Laws continue to struggle to deal with the problem. With anonymous creators generating thousands of images on the fly in other jurisdictions, it’s an uphill battle to rein in the situation.
The genie’s bottle: broken
Deepfakes appear to be seeping into most aspects of technological life. Witness someone resurrect their father, then be utterly mortified by what they’ve done. You’ve got those who continue to talk about the risk deepfakes pose to business. Elsewhere, the tattered remnants of “deepfakes could derail the US elections” smoulder quietly in the corner.
For almost everyone else, though, the most probable harm comes from what kicked deepfakes into the mainstream in the first place: pornographic images created without permission. I’m willing to bet that’s going to be the biggest issue for a long time to come.