Deepfake victims must punish Big Tech because Congress won’t

By Dr. Lyndon Haviland

April 12, 2024

Concern over AI deepfakes has largely focused on their use in election interference in this year’s U.S. presidential race. But they raise a more depraved problem that should scare all of us: deepfake pornography, where software programs accessible through a simple online search can turn an innocent image of an unwitting individual into a sexualized scene or video, posted online without consent.

Social media companies have proven they can’t police themselves, but Congress can. It can amend the Communications Decency Act to hold social media companies liable when deepfake pornographic images are published on their platforms. It can pass bills such as the Preventing Deepfakes of Intimate Images Act and the SHIELD Act, which would make the circulation of deepfake pornography a crime. It can pass the DEFIANCE Act, which would strengthen the rights of deepfake pornography victims.

But Congress has gutlessly failed to act on any of these measures.

As long as Congress remains too timid to stand up to Big Tech, and as long as social media companies fight tooth and nail to evade responsibility, victims should drown those companies in litigation.

Maybe then, and only then, will they get the message that the burden is on them to solve this crisis.