An underground website called OnlyFake is claiming to use “neural networks” to generate realistic looking photos of fake IDs for just $15, radically disrupting the marketplace for fake identities and cybersecurity more generally.

In our own tests, OnlyFake created a highly convincing California driver's license, complete with whatever arbitrary name, biographical information, address, expiration date, and signature we wanted. The photo even gives the appearance that the ID card is laying on a fluffy carpet, as if someone has placed it on the floor and snapped a picture, which many sites require for verification purposes.

404 Media then used another fake ID generated by this site to successfully step through the identity verification process on OKX. OKX is a cryptocurrency exchange that has recently appeared in multiple court records because of its use by criminals.
Yes, but do you really need a neural network to create a fake ID? OnlyFake makes you upload your own photo. There's practically nothing to generate at that point. From there, I imagine it's just an image compositing program, like a fancy version of Link Finds.
While OnlyFake says it uses “neural networks” to create its fake IDs, 404 Media has not seen evidence that the service uses generative AI tools. “I don’t know exactly what is going on here but I suspect that they are using some tech to insert/replace the image into a template of a license/id,” Hany Farid, a professor at UC Berkeley and one of the world’s leading experts on digitally manipulated images, told 404 Media in an email. “If they were using genAI to create whole cloth the entire ID, they would have trouble dealing with inconsistencies in the background.”
Yeah. I mean, I'm sure there's a lot of work they had to do to get the details right, but I think this grift is a better fit for normal programming than for ML. But AI sells, whether the audience is VCs and/or criminals.
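Farid's template hypothesis also suggests a simple way to check it, at least in principle: if the site is pasting an uploaded photo into a fixed license template, two of its outputs should be nearly pixel-identical everywhere outside the photo and the text fields. Here's a rough sketch of that comparison in Python with Pillow. The filenames and photo-region coordinates are hypothetical, and this is just an illustration of the idea, not anything 404 Media or Farid describes running.

```python
# Speculative check of the "template compositing" theory: diff two generated
# IDs (assumed to be the same size) and blank out the assumed photo region,
# so only template-level differences remain visible.

from PIL import Image, ImageChops

def diff_outside_photo(path_a, path_b, photo_box):
    """Return the pixel difference of two images with the photo region blanked out.

    photo_box is a hypothetical (left, upper, right, lower) box for where the
    uploaded headshot lands in the template.
    """
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    diff = ImageChops.difference(a, b)  # per-pixel absolute difference
    blank = Image.new("RGB", (photo_box[2] - photo_box[0], photo_box[3] - photo_box[1]))
    diff.paste(blank, (photo_box[0], photo_box[1]))  # zero out the photo area
    return diff

if __name__ == "__main__":
    # Hypothetical filenames and coordinates.
    d = diff_outside_photo("id_sample_1.png", "id_sample_2.png", (40, 60, 240, 320))
    d.save("diff.png")
    print("remaining differences bounded by:", d.getbbox())
```

A mostly black difference image outside the photo and text areas would point toward template compositing rather than whole-image generation, which is exactly the background consistency Farid is talking about.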