By Anadolu Agency
July 6, 2023 12:58 pm
ISTANBUL
Antivirus company Eset said Thursday that criminals are transforming ordinary pictures into fake nudes using ready-made AI tools and blackmailing the owners of the photographs.
As AI-based deepfake technology capable of creating fake nude photos becomes easier to use, visuals shared on social media can be turned into blackmail tools.
The FBI has warned of an increasing number of deepfake cases in which visuals featuring adults and minors are circulated on social media and adult websites, Eset said in a statement.
Deepfake technology uses neural networks to imitate a person's appearance or voice: visuals are compressed by an encoder and then regenerated by a decoder, it said.
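Eset's statement does not include code; the following is a minimal illustrative sketch in PyTorch of the encoder-decoder (autoencoder) idea the statement describes. The class name, layer sizes, and training step are assumptions chosen for illustration, not a real deepfake pipeline.

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Toy autoencoder: the encoder compresses an image into a small
    latent representation; the decoder regenerates the image from it."""
    def __init__(self):
        super().__init__()
        # Encoder: 3x64x64 image -> compact latent feature map
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
        )
        # Decoder: latent feature map -> reconstructed 3x64x64 image
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(16, 3, kernel_size=4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),  # pixel values in [0, 1]
        )

    def forward(self, x):
        latent = self.encoder(x)     # compress
        return self.decoder(latent)  # regenerate

# One training step on a dummy batch: minimize reconstruction error.
model = ConvAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.rand(8, 3, 64, 64)  # stand-in for real face crops
optimizer.zero_grad()
reconstruction = model(images)
loss = nn.functional.mse_loss(reconstruction, images)
loss.backward()
optimizer.step()
```

Classic face-swap deepfakes build on this structure by training a shared encoder with a separate decoder per identity, so that encoding one person's face and decoding it with another person's decoder performs the swap.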
It noted that criminals can mimic a person's facial movements by placing their face on someone else's body almost indistinguishably.
Access to the technology is easier than ever, and even novice users can produce convincing results, which should worry everyone, according to Eset.
It said fake content is very difficult to remove from online platforms.
The firm suggested being more careful when sharing visuals on the internet, especially those of children, and securing social media accounts.