Two-Thirds of States Enact Bills Protecting Public from Deepfake Porn

WASHINGTON, D.C. — Yesterday, Oklahoma’s governor approved a new law making it illegal to create and disseminate non-consensual intimate deepfakes — realistic images and videos created using technology that depict a person nude or engaging in a sexual act.

With Oklahoma’s law enacted, a full two-thirds of U.S. states now have laws on the books protecting the general public from this type of AI-generated deepfake, which can cause serious, lifelong harm to innocent people. Four additional states have laws protecting only minors. The vast majority of these laws have been passed in the past few years.

“No one should have to experience being the victim of a non-consensual deepfake, or worry that they or their child might be,” said Ilana Beller, democracy organizing manager with Public Citizen. “People often assume that there are already laws that protect people from this type of abuse, but without new legislation focused specifically on AI threats, that is generally not the case. It is heartening to see state legislators taking this issue so seriously. They are transcending partisanship and acting swiftly to protect their constituents.”

With rising public awareness of what AI-generated intimate deepfakes are and the trauma that victims — mostly women and girls — experience, states have made rapid progress on legislation to protect people. So far this session, 39 states have introduced legislation to regulate the creation and dissemination of so-called “deepfake porn.”

Public Citizen created a model law to assist lawmakers in taking action on this issue and is tracking the progress of bills in our intimate deepfakes legislation tracker, which is updated regularly.

For more information on this issue, or to speak with an expert, contact eleach@citizen.org.