Person wearing yellow jumper holds phone showing Telegram app.


Imagine that you upload a photograph of yourself on holiday to your favorite social media platform. You are dressed in a swimsuit and you are smiling at the camera. Now imagine later coming across this image while scrolling through your newsfeed. You recognize your face and the background and it looks like your photo, but in this image, you are completely naked. There are some inconsistencies – you do not recognize the body in the image – but it is convincing nonetheless.

This might sound like a scene from a Black Mirror episode but it is, in fact, a real possibility thanks to tools available on the social media app Telegram, which allow users to upload an innocent image of a (clothed) person and request that the person in the image be “digitally undressed” for a fee. Telegram has more than 400 million active monthly users.

While Telegram operates predominantly as a messaging app, it facilitates autonomous programs (referred to as “bots”), one of which is able to digitally synthesize these deepfake naked images.

Deepfake detection company Sensity recently published research into Telegram. It found that 70% of the deepfake bot’s users target women and that, as of the end of July 2020, at least 104,852 fake nude images had been shared in an “image collections” channel available on the app. The number of user-requested images that have been publicly shared is likely to be much higher. The ease with which such “image manipulation” can be carried out without the knowledge of its victims is alarming.

So: is the use of deepfake bots to produce pseudo naked images legal?

Underage pictures

The Telegram bot has been linked to reports of images that appear to be of underage girls. In this case – if the person in the image is underage – the legal position is clear. Images of real children which are altered to appear nude or sexually explicit are internationally unlawful. The Convention on the Rights of the Child, ratified by 196 countries, requires parties to the convention to take steps to protect children from being sexually exploited and being used in the production of pornographic material.

As long as Telegram removes reported indecent images of children, it is not culpable under current international legal frameworks if a user uses the deepfake bot to produce an indecent image of a child. It is doubtful, however, that these frameworks make the bot itself unlawful.

In the UK, international obligations to protect children from sexual exploitation are bolstered by laws prohibiting the production of sexual pseudo-imagery, such as photoshopped images of a young person appearing naked. The Protection of Children Act 1978 prohibits the creation and distribution of such an image, and Section 160 of the Criminal Justice Act 1988 also makes it an offense for a person to possess a pseudo-image portraying an indecent image of a child.

What about adults?

For women and men over the age of 18, the production of a sexual pseudo-image of a person is not in itself illegal under international law or in the UK, even if it is produced and distributed without the consent of the person portrayed in the image.

This is, as usual, a case of the law playing catch-up. International laws created to protect privacy do not necessarily protect people from this type of abuse. Article 8 of the European Convention on Human Rights, which provides a right to respect for a person’s “private and family life, home, and correspondence,” has been used as the basis for domestic laws throughout the UK and Europe to protect photographs, but only if the original image remains unaltered.