
The new virtual threat: Voice cloning!


With voice cloning, a human voice can be copied almost exactly, and in the hands of scammers and cybercriminals a cloned voice becomes a dangerous weapon.
To clone a voice, a recording of it is first uploaded to artificial intelligence software. From that point on, the software can speak any text typed on a keyboard in that voice.
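
As a rough illustration of how accessible this workflow has become, the sketch below performs the same steps with an open-source text-to-speech library (Coqui TTS and its XTTS v2 model are used here purely as an example; the article does not name any specific tool, and the file paths are placeholders). A short reference recording plays the role of the uploaded voice, and the typed text is synthesized in that voice.

```python
# Minimal voice-cloning sketch (example only; the library, model and file paths
# are assumptions for illustration, not tools mentioned in the article).
# Requires: pip install TTS
from TTS.api import TTS

# Load a multilingual, zero-shot voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary typed text in the voice of the reference recording.
tts.tts_to_file(
    text="Hello, this sentence was never actually spoken by me.",
    speaker_wav="reference_voice.wav",  # a few seconds of the target voice (placeholder path)
    language="en",
    file_path="cloned_output.wav",      # resulting audio in the cloned voice
)
```

That a convincing clone can be produced from such a short sample is what makes the fraud scenarios described below possible.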

According to a report by TRT Haber, cybersecurity expert Dr. Soner Çelik said the following about voice cloning:

"First of all, it actually includes the 'deep fake' technology, whose history dates back to 1997. Machine learning, which we call 'deep learning'. Here, it is basically tried to obtain videos, images and sounds that are almost exactly like the real ones visually and audibly with fake videos. A new type of threat used by

Voice cloning is a new type of threat used by scammers, who target large companies with it: calling as if they were a manager or one of the company's partners, they demand money.

"International issues can be turned into crises"

Çelik said: "By imitating voice recordings, whether of leaders or ordinary citizens, large masses can be mobilized and people can be influenced. With imitation audio and video, interstate and international issues can be turned into crises."

Risk in the art world

The art world is also among the areas threatened by voice cloning. Many fields, such as copyright and the protection of personal data, also need regulatory updates to deal with voice cloning.
