Deepfake Pornography and the Law
Updated: Jan 16
Deepfake pornography refers to a pornographic video in which the face or body of the actor(s) has been “digitally altered so that they appear to be someone else, typically […] maliciously or to spread false information”. Deepfakes in general have existed since the beginning of the digital era, although they were predominantly used on celebrities and politicians for comedic effect.
Deepfakes have now spread into the realm of pornography, amongst other areas including entertainment and attempts to influence politics.
Kate Isaacs’ experience of deepfake pornography
Recently, Kate Isaacs fell victim to deepfake pornography when her face was digitally transposed onto the body of a porn actress.
In 2019, Kate founded the #NotYourPorn campaign, “a UK-based, sex positive movement fighting to protect nonconsenting adults, sex workers and under 18s from image-based sexual abuse”. This campaign led Pornhub to take down all videos uploaded by unverified users, which represented most of the site’s content. As a result, Kate assumed that the video in which her face was deepfaked onto a porn actress’s body was the work of a user frustrated at having their video taken down from the website.
The abuse went a step further when Kate’s home and work addresses were shared in the video’s description, compromising her and her family’s safety. This constitutes ‘doxing’, defined as the “action or process of searching for and publishing private or identifying information about a particular individual on the internet, typically with malicious intent”.
Kate’s experience was truly horrific and traumatising. She understandably feared for her safety, but also for the security of her employment and for her reputation: what people might think of her in her personal and professional life. This type of digitally altered video can have deeply damaging effects, and it can be very difficult to convince others that the video is not something you agreed to participate in. Particularly now, when the technology is of such high quality, it can be nearly impossible to distinguish a deepfake from a legitimate video.
However, this is not an isolated case. Kate is one of many women who have fallen victim to deepfakes and, in her case, deepfake pornography. So how can we prevent this from happening again, or punish offenders appropriately?
The law surrounding deepfake pornography in the UK
Turning to the law, deepfake pornography falls under the same umbrella category as revenge porn, that of image-based sexual abuse, a criminal offence whereby “someone shares sexually explicit images or videos of another person without their consent, and with the aim of causing them distress or harm”.
Scotland has already created an offence of sharing images or videos that show a person in an intimate situation without their consent. In the rest of the UK, however, the offence is only made out if it can be proven that the perpetrator intended to cause the victim distress. The subjective nature of this criterion creates a loophole that allows perpetrators to avoid conviction, because an intention to cause distress is difficult to prove on any level. Perpetrators could claim that it was a joke gone too far, or that it was meant for entertainment and they never meant to cause any harm, and be acquitted very easily. Moreover, the burden of proof falls on the prosecution, which must establish the offence beyond a reasonable doubt. The prosecution would therefore need sufficient evidence that a particular perpetrator both created and distributed the deepfake pornography. This raises further practical difficulties: tracing who created an image, and who first distributed it, demands digital forensic resources that police forces may not have, and it remains unclear whether creation and distribution would be punished equally.
In the UK, the Online Safety Bill has been in progress for several years but was repeatedly shelved and postponed in order to prioritise other Bills. The Bill is now set to come into effect in early 2023, following a push to prioritise the protection of victims such as Kate Isaacs. Up until now, with no legislation in place specifically addressing deepfake pornography, it has been “difficult to prosecute offenders, and the law has struggled to keep up with rapid social media innovation, which is also inherently difficult to police”.
The Bill is set to make deepfake porn illegal in the UK, amongst other harmful digital practices. Kate Isaacs recently took to Instagram to comment on the news of the criminalisation of deepfake pornography and said: “The proposals, especially the Government’s focus on the lack of consent as the base for culpability are a step in the right direction however intention is not action [and] these laws need to be made with the mindset of getting it right the first time.”
As well as criminalising deepfake pornography, the law will make it easier to charge those “sharing intimate photos without consent”, commonly referred to as revenge pornography.
However, when the Bill was initially drafted three years ago, deepfake pornography was seen as something achievable only by professionals, not by anyone with an app downloaded from an app store, as it is today. There are therefore serious concerns that the Bill will not be able to keep pace with the rapid evolution of the technology.
Possible solutions: looking to the future
With a reported 1 in 14 adults in England and Wales having been threatened with the non-consensual sharing of their intimate images online, this problem is very serious. It requires careful consideration to ensure that victims are protected as much as possible and that potential perpetrators are deterred from posting non-consensual intimate images, creating deepfake footage and more.
One solution is to improve reporting tools for victims so that perpetrators are identified as promptly as possible and given an appropriate sentence. To protect victims and encourage them to speak up, anonymity tools have been suggested: an anonymity tool is defined as “software that keeps the identity of the user private”. Professor Clare McGlynn of Durham University, an expert in image-based sexual abuse, said that it is “absolutely vital that anonymity is granted immediately”. Victims of image-based sexual abuse will presumably already be shaken enough; if they have the courage to speak up for themselves and for others, we must make sure that they feel safe to do so and are able to place their trust in the justice system. Speaking up and reporting this kind of abuse should not compromise their safety.
Additionally, image-based sexual abuse could be criminalised under the crime of identity theft (identity fraud). Looking to the future, the case law might have to evolve to incorporate a crime combining image-based sexual abuse with identity theft. Such a crime would also require an appropriate sanction. In the USA, for instance, some states penalise revenge porn with fines and imprisonment, and these sanctions may be cumulated with those for other crimes where applicable, such as blackmail and cybercrimes. Civil remedies may also be available to the victim, such as damages covering representation fees and other associated legal costs. The UK could borrow the USA’s approach to sanctions as a starting point and develop its own reasoning and appropriate sanctions from there.
The Online Safety Bill may be an appropriate tool to modernise the law on deepfake pornography. However, the case law will likely be required to step in to provide the most up-to-date interpretations of the law on the topic, apply the law to emerging technologies, and guide the courts on sentencing to protect victims and punish perpetrators.
If you have been deepfaked
Kate Isaacs’ #NotYourPorn campaign advises anyone who falls victim to a deepfake to collect evidence, report the account responsible for the deepfake, contact the police and reach out for support.
References
1. Oxford Languages, definition of ‘deepfake’.
2. BBC News, ‘Deepfaked: They put my face on a porn video’ <https://www.bbc.com/news/uk-62821117> accessed 26 November 2022.
3. #NotYourPorn <https://notyourporn.com/> accessed 5 December 2022.
4. BBC News, ‘Pornhub removes all user-uploaded videos amid legality row’ <https://www.bbc.com/news/technology-55304115> accessed 5 December 2022.
5. Oxford Languages, definition of ‘doxing’ <https://languages.oup.com/google-dictionary-en/> accessed 5 January 2023.
6. BBC News, ‘Deepfaked: They put my face on a porn video’ <https://www.bbc.com/news/uk-62821117> accessed 26 November 2022.
7. Criminal Justice and Courts Act 2015, s 33; Victim Support, ‘Image-based sexual abuse’ <https://www.victimsupport.org.uk/crime-info/types-crime/cyber-crime/image-based-sexual-abuse/#:~:text=Image%2Dbased%20sexual%20abuse%20(sometimes,causing%20them%20distress%20or%20harm> accessed 5 December 2022.
8. LAW CITATION
9. Alan Collins, Hugh James Solicitors.
10. Kate Isaacs and #NotYourPorn on Instagram <https://www.instagram.com/p/ClYPslfogAY/> accessed 5 December 2022.
11. BBC News, ‘Sharing pornographic deepfakes to be illegal in England and Wales’ <https://www.bbc.com/news/technology-63669711> accessed 5 December 2022.
12. PCMag, definition of ‘anonymity application’ <https://www.pcmag.com/encyclopedia/term/anonymity-application#:~:text=Software%20that%20keeps%20the%20identity,may%20also%20include%20encryption%20capabilities.> accessed 5 January 2023.
13. BBC News (n 10).
14. Criminal Defense Lawyer, ‘Revenge Porn: Laws & Penalties’ <https://www.criminaldefenselawyer.com/resources/revenge-porn-laws-penalties.htm> accessed 5 January 2023.
15. #NotYourPorn <https://notyourporn.com/> accessed 5 December 2022.
16. firstname.lastname@example.org or similar organisations.