Think deepfakes don’t fool you? Sorry, you’re wrong
In early March, a manipulated video of Ukrainian President Volodymyr Zelenskyy circulated online. In it, a digitally generated Zelenskyy told the Ukrainian national army to surrender. The video was quickly debunked as a deepfake: a hyper-realistic yet fabricated video produced using artificial intelligence. While Russian disinformation appears to be having a limited impact, this alarming example illustrates the potential consequences of deepfakes. However, deepfakes are also being used successfully in assistive technology. For instance, people who suffer from Parkinson's disease can use voice cloning to communicate. Deepfakes are used in education, too: Ireland-based speech synthesis…