{"id":32645,"date":"2019-06-28T14:50:39","date_gmt":"2019-06-28T09:20:39","guid":{"rendered":"https:\/\/yaabot.com\/?p=32645"},"modified":"2024-02-01T12:58:45","modified_gmt":"2024-02-01T07:28:45","slug":"deepfakes-are-growing-the-true-beginning-of-the-post-truth-era","status":"publish","type":"post","link":"https:\/\/entropymag.co\/deepfakes-are-growing-the-true-beginning-of-the-post-truth-era\/","title":{"rendered":"Deepfakes are Growing: The New Beginning of the Post-Truth Era"},"content":{"rendered":"\n
A doctored video of US House Speaker Nancy Pelosi, a fake Barack Obama address voiced by comedian Jordan Peele, a talking Mona Lisa \u2013 deepfakes are making their way into popular culture. Built on generative adversarial networks (GANs), deepfake tools can construct very convincing video from scratch, and they are surprisingly accessible: anyone can use them to combine or superimpose any number of images and videos onto a source image or video. <\/p>\n\n\n\n
Here\u2019s how deepfaking works: two machine learning models are trained in tandem on a data set \u2013 one generates video forgeries, while the other tries to detect them. The forgeries are fed back into the detector, and both models improve until the detector can no longer tell fake from real. Creating deepfakes was once the bailiwick of sci-fi films and propaganda-producing agencies, but now all it takes is an app or a piece of free-to-download software. <\/p>\n\n\n\n
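The adversarial loop described above can be sketched in miniature. The following is a toy, illustrative example in plain NumPy \u2013 not code from any real deepfake tool \u2013 in which a one-parameter "generator" learns to mimic a 1-D Gaussian "real" data distribution while a logistic-regression "discriminator" tries to tell the two apart, mirroring the train-forge-detect cycle:

```python
import numpy as np

# Toy 1-D GAN sketch. "Real" data ~ N(4, 1.25); the generator is an affine
# map of noise; the discriminator is logistic regression on a single sample.
# All names and parameters are illustrative assumptions.

rng = np.random.default_rng(0)

g_w, g_b = 1.0, 0.0   # generator:      x_fake = g_w * z + g_b
d_w, d_b = 0.1, 0.0   # discriminator:  p(real) = sigmoid(d_w * x + d_b)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.01
for step in range(5000):
    x_real = rng.normal(4.0, 1.25)
    z = rng.normal()
    x_fake = g_w * z + g_b

    # Discriminator update: push p(real sample) -> 1, p(fake sample) -> 0
    # (one SGD step on binary cross-entropy).
    p_real = sigmoid(d_w * x_real + d_b)
    p_fake = sigmoid(d_w * x_fake + d_b)
    d_w += lr * ((1 - p_real) * x_real - p_fake * x_fake)
    d_b += lr * ((1 - p_real) - p_fake)

    # Generator update: make the forgery harder to detect by pushing the
    # discriminator's p(fake sample) toward 1 (non-saturating GAN loss).
    z = rng.normal()
    x_fake = g_w * z + g_b
    p_fake = sigmoid(d_w * x_fake + d_b)
    grad = (1 - p_fake) * d_w
    g_w += lr * grad * z
    g_b += lr * grad

# After training, generated samples drift toward the real distribution's mean.
fakes = g_w * rng.normal(size=10_000) + g_b
```

Real deepfake systems work on millions of image pixels with deep networks rather than two scalars, but the feedback loop \u2013 forge, detect, improve, repeat until detection fails \u2013 is the same.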
The key here is the data set: the larger it is, the more convincing the deepfake. Since most people believe what they see, deepfakes hack this human tendency to spread fake news and disinformation. They can easily be used to manipulate and discredit a target, shift global policy outlooks, undermine elections or throw a country into crisis. Consider, for example, the video from the White House that surfaced in November last year. A touch of editing was enough to make it appear that CNN reporter Jim Acosta had struck a female White House intern as she attempted to take the microphone from him. <\/p>\n\n\n\n
Remember George Orwell\u2019s 1984, where Winston Smith rewrites news and adjusts (read: incinerates) historical records to fit the goals of the state? Everything from smoothing out errors in shapes, light and shadows to adjusting the angles of facial features and the softness and weight of clothing and hair can now be done with basic technical know-how. <\/p>\n\n\n\n
At a time when deepfakes hew so close to reality, they can pass as gospel truth for partisans who barely agree on facts as it is, letting audiences rely on them to an unprecedented degree while sparing them the need to trust anything else. Worse, most media platforms today compress videos into smaller formats to make them quicker to upload and easier to share, and this lossy compression can strip away the critical clues that help detect fakes. The problem is that right now there isn\u2019t much funding for tools to detect deepfakes \u2013 but there sure is immense incentive to create them.<\/p>\n\n\n\n