The controversy centres on this app: photographs of girls are being manipulated using a tool available on it.
More than ten thousand girls targeted
A deepfake tool on Telegram can digitally remove the clothing of a person in a photograph. This tool is being misused to harass minor girls and tamper with their photos. Reportedly, more than one lakh indecent pictures of over ten thousand girls and women have so far been shared online without consent.
Use of AI Bot
According to reports, a new artificial intelligence bot (AI bot) operating on the Telegram network has been used to create indecent photographs of girls and women. The bot has been active for nearly a year and allows users to generate pornographic images of girls.
Videos of celebrities also made
This tool is being used to target not only ordinary girls and women but celebrities as well. There are reportedly more than 1 lakh fake pictures of girls on Telegram, 70 percent of which were taken from social media or private sources. Most of the photographs shared are of ordinary citizens, though obscene videos of celebrities were also made with this technique a few days ago.
Revealed by threat intelligence company
Creating the fake pornographic images now going viral requires only an ordinary photograph; the rest is done by software. The case was uncovered by the visual threat intelligence company Sensity. The company's CEO, Giorgio Patrini, said the bot can generate a nude image from just one picture, which is why ordinary people are so easily targeted. Even a Facebook profile picture is enough to create a nude photo.