
Why Are Deepfakes So Effective In Today’s Society?
A deepfake blurs the line between what is real and what is fake. An image, audio clip, or video might very well be doctored, with no reliable way of telling. For the actor pulling the strings, it can be a powerful tool to sway public opinion.
Manipulating public opinion is as old as history, but the means by which that manipulation is executed have changed drastically in recent years, most notably through the rise and popularization of deep learning technology. And with that change, the effectiveness of the manipulation has shifted.
When the average Joe can no longer tell the difference between fact and fiction, a new world opens up. A modern-day Pandora’s box, some would say. It’s a world in which a few people with some technical skill can change the minds of many, if they wish to do so. It is exactly this threat that forces us to think about the immense consequences AI-generated deepfakes can have on us.
If you were the person who could make anyone say anything, what would you choose to do with it?
Deepfakes In Today’s Society
With the rise of deep learning technology, creating deepfake images, audio, or video with the help of self-learning Artificial Intelligence (AI) is within reach of almost anyone. Seemingly realistic but falsified media content can quite literally be created by any person with an agenda. And there’s not much we can do about it.
It’s absolutely true that detection technology is playing a game of catch-up with these rapid developments, but it’s a digital war that simply cannot be won. Detection software will always lag one step behind the latest generation tech, because detectors learn from the fakes that already exist, while the generators are trained precisely to fool the best available detectors.
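To make that asymmetry concrete, here is a minimal, hypothetical sketch of the generator-versus-discriminator training loop behind many deepfake systems (a GAN), written in PyTorch. The network sizes and data shapes are toy assumptions for illustration, not a real deepfake architecture; the point is step 2, where the generator is explicitly updated to fool whatever the current detector has learned.

```python
# Toy GAN training loop: illustrates why detection lags behind generation.
# All sizes and architectures are placeholder assumptions, not a real deepfake model.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 784  # assumption: flattened 28x28 toy "images"

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)
discriminator = nn.Sequential(  # the "detector" in this arms race
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # raw logit: real vs. fake
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Train the detector to separate real samples from current fakes.
    fake_images = generator(torch.randn(batch, latent_dim)).detach()
    d_loss = (loss_fn(discriminator(real_images), real_labels)
              + loss_fn(discriminator(fake_images), fake_labels))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to fool the *current* detector: whatever the
    #    detector just learned becomes the target the generator optimizes against.
    g_loss = loss_fn(discriminator(generator(torch.randn(batch, latent_dim))), real_labels)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# One toy update with random stand-in "real" data (assumption: no real dataset here).
train_step(torch.rand(32, image_dim) * 2 - 1)
```

Every improvement a detector makes becomes, in effect, training signal for the next generation of fakes, which is why detection is structurally the reactive side of this arms race.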
With that realization in mind, we must brace our societies for the inevitable consequences. Teach our children to doubt and question the world around them. And above all, check our own facts as much as humanly possible.
Once we fall into the inevitable confirmation bias trap, existing societal divisions will only grow deeper over time. Why believe what actually happened, when you can simply tune in to that news channel with ‘altered facts’? This is not science fiction; it is already a reality in significant portions of American and international news broadcasting.
Confirmation Bias As A Polarization Catalyst
Human brains work in a peculiar way – we tend to seek out confirmation of what we already believe to be true, regardless of fact. Regardless of science. Regardless of proper research.
This phenomenon is known as ‘confirmation bias’. Once we hold a belief, we tend to cling to it, even when it’s untrue:
- People tend to surround themselves with messages that confirm their pre-existing opinions
- People tend to ignore messages that disprove or don’t fit their pre-existing opinions
Despite our best efforts to remain truthful, humans are inherently flawed creatures.
And for good reason, too. Back when we were hunter-gatherers, living in small tribes of a few dozen people was a great survival strategy. That is where our tribe-like thought processes stem from: we want to be part of a group, and agreeing with your own group increased your odds of survival.
While we are obviously oversimplifying things, the idea is commonly found in psychological science: it would make confirmation bias little more than a psychological survival mechanism.
Our brains haven’t changed much since those tribal times. Humans still (unconsciously) prefer to identify with the set of ideas supported by ‘their’ group of people. Deepfakes can be tools to reaffirm these beliefs. If someone inevitably decides to back up their statement with falsified media content that nobody knows is false, the truth is no longer a relevant factor. A disturbing consequence of our tribal brains colliding with hyper-realistic deepfake tech.
Tribal Brains vs. Deepfakes
It is exactly this blurred line between what is true and what is false that is likely to further divide societies. Deepfake technology could very well become a catalyst for further polarization between groups of people. It doesn’t really matter whether those groups are formed around political beliefs, personal interests, work affiliation, sexual orientation, race, gender, or other factors.
Wherever there is room to divide one group of people into two (or more) groups, deepfake-altered media can be used to confirm bias and reaffirm pre-existing beliefs. Our tribal brains push us to stick with what we believe is true, rather than let actual facts change our opinions.
Spreading Doubt And Propaganda
Most people harbor some type of false belief, whether they know it or not. And it doesn’t have much to do with intelligence, either. Professors, doctors, and politicians all do it, too. So there’s no need to feel bad about it; it’s ingrained in who we are as human beings.
Given enough time, someone, somewhere will inevitably use ‘alternative facts’ to support their agenda. And they’ll be likely to use deepfake technology to ‘prove’ their statements. It’s their way of spreading doubt and propaganda within societies. And don’t assume this cannot reach the most important political levels, either: fact-checkers already rate a large share of American political statements as ‘mostly false’ or outright ‘false’.
Deepfakes As A Tool To Alter Opinion
It is abundantly clear that deepfakes being used for ‘the bad stuff’ is pretty much inevitable. With the technology becoming harder and harder to spot as it improves, the race for the truth is about to be lost.
And there’s not much we can do about it either, which is the toughest part of it all. The fact that deepfake tech is outpacing ‘deepfake detection’ software means that we as a society carry the burden of dealing with the consequences.
Exactly how bad it will eventually turn out to be is very difficult to predict. But scientists and professors in the deep learning field aren’t optimistic about where the situation is heading. Not even in the slightest, actually. So we have to look beyond: mitigating the consequences and instilling critical thinking skills in the younger generation seem to be the only options to brace future generations for what is to come.
As for the here and now, there is still a brief window in which we can breathe. Deepfakes are here, but they are still detectable, and facts are still being checked by professional institutions. As long as our democracy works and we keep checking each other’s beliefs, things will be okay. For now.