President Trump is a pathological liar who denies saying most of what he said. Facts he doesn’t like get distorted. Truths get twisted. Anything he doesn’t agree with gets labeled as fake news. One minute we’re going back to the Moon. The next week it’s Mars, of which the Moon is a part. It’s nearly impossible to follow the paper trail of his thought process. Which is exactly the point. Misinformation has become his weapon of choice. And as scary as that is, it’s just the beginning of the dystopian nightmare we’re about to live through. For we’re getting to the point where technology will soon make it easy to change what people say and even how they say it.
If you thought the fake news problem that heavily influenced the last election was bad, then I’ve got some really bad news for you. Blog posts and memes that tout misinformation and spread lies like wildfire will soon seem quaint. We’re about to open a whole new can of worms thanks to our newfound ability to create extremely realistic deep fake videos: so-called AI-generated content that’s nearly indistinguishable from the real thing. It’s a trend that started with turning porn scenes into fake celebrity sex tapes and has now continued on to petrifying proof-of-concept videos showing politicians saying things they never said, most notably the recent high-profile incidents involving Nancy Pelosi and Mark Zuckerberg. If the 2016 election was a high-tech battlefield, the 2020 election is going to be a war zone.
It’s not just the politicians who have to worry though. In true dystopian fashion, the larger population may soon grow fearful of this propaganda-generating tactic as well. For politicians having to deny an altered video is one thing. Politicians using that very mistrust of videography to claim that legitimate videos opposing their regime are actually deep fakes is quite another. Would we get to the point where mistrust rules the day? Where video evidence is no longer admissible in a court of law? Where powerful viral videos exposing human rights violations lose their luster? Where social media networks lose their effectiveness entirely because no one can believe anything they see or hear?
Not all news on this front is grim. Thanks to additional advances in artificial intelligence, it’ll also soon be possible to alter how we say things. In fact, we may soon be able to speak a foreign language as naturally as a native speaker. And we’ll have Google’s Translatotron to thank. A skill that, unlike deep fakes, may actually be useful.
According to Futurism, “It’s not the first system to translate speech from one language to another, but Google designed Translatotron to do something other systems can’t: retain the original speaker’s voice in the translated audio.
In other words, the tech could make it sound like you’re speaking a language you don’t know — a remarkable step forward on the path to breaking down the global language barrier.”
On the surface that sounds good. I’m having trouble trying to imagine a nightmare scenario involving rogue language-barrier tech. But you never know how things will play out. Deep fakes may ultimately wind up benefiting humanity if they let people create works of art they never could have imagined before. Meanwhile, Translatotron could lead to unforeseen consequences as people take the easy way out and no longer try to learn a foreign language on their own. Something that could lead to languages getting lost, and even societies collapsing, if we became too reliant on a technology that suddenly failed.
This contrast between deep fakes and Translatotron highlights the inherent risk involved in the development of any new technology. Altruistic technologies may wind up getting perverted by nefarious actors. Weapons might wind up getting co-opted for the greater good. When you invent something, you really never know what will become of your invention. It’s up to all of us to put the necessary checks and balances in place to ensure that the dystopia we fear never comes to pass. That we take the necessary steps to ensure that the plot of a Black Mirror episode never comes to fruition in real life.
Will trust continue to be an issue going forward?