The UK’s Bold Move: Criminalising Sexually Explicit Deepfakes to Protect Privacy and Dignity
The UK government has announced plans to criminalise the creation and sharing of sexually explicit “deepfake” images without consent. This initiative is part of a broader effort to address the growing misuse of artificial intelligence to produce non-consensual intimate content, which predominantly affects women and girls.
Deepfakes involve the use of AI to create realistic but fabricated images or videos, often superimposing an individual’s likeness onto explicit material without their consent. Current laws, such as those addressing revenge porn, do not adequately cover these fabricated images, necessitating new legislation.
The forthcoming measures will introduce specific offences for creating and sharing non-consensual deepfake images, as well as for taking intimate images without consent and installing equipment with the intent to do so. Perpetrators could face up to two years in prison. These provisions will be included in the upcoming Crime and Policing Bill.
This legislative move follows a sharp increase in deepfake-related abuse, with data indicating a 400% rise since 2017. The Ministry of Justice has emphasised the serious harm caused to victims and the necessity of these measures to combat online abuse and protect individuals’ privacy.
Additionally, the Online Safety Act, passed in September 2023, makes the non-consensual sharing of intimate images a priority offence, holding platforms accountable for removing such content under the oversight of Ofcom.
Victims of deepfake pornography have suffered severe emotional and psychological distress, as recent cases of fabricated explicit images being shared online without consent have shown. These incidents have underscored the urgent need for legal reforms addressing both the creation and the distribution of non-consensual deepfake material.