This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.
Why do we get drawn into heated arguments with our neighbors on Facebook or into politically charged YouTube videos? That is the question my colleague Cade Metz wants us, and the companies behind our favorite apps, to ask ourselves.
Cade's latest article is about Caolan Robertson, a filmmaker who for more than two years helped create videos featuring right-wing YouTube personalities, videos he says were intentionally provocative and confrontational, and often deceptively edited.
Cade’s coverage is an opportunity for us to ask tough questions: Do the rewards of internet attention encourage people to post the most incendiary material? How much should we trust what we see online? And do we tend to seek out ideas that fuel our anger?
Shira: How much blame does YouTube deserve for people like Mr. Robertson, who make videos that emphasize conflict and social division, and that in some cases have been manipulated?
Cade: It’s tough. In many cases, these videos became popular because they confirmed some people’s prejudices against immigrants or Muslims.
But Caolan and the YouTube personalities he worked with also learned how to play up or invent conflict. They could see that these kinds of videos got attention on YouTube and other websites. And YouTube’s automated recommendations also sent lots of people to these videos, which encouraged Caolan to make more of them.
A Facebook executive recently wrote, in part, that his company is mostly not to blame for pushing people toward provocative and polarizing material; that is simply what people want. What do you think?
There are all sorts of things that feed our appetite for sensational or outrageous material, including talk radio, cable television and social media. But it’s irresponsible to just shrug and say that’s how people are. We all have a role to play in not stoking the worst of human nature, and that includes the companies behind the apps and websites where we spend our time.
I’ve thought about this a lot in my coverage of artificial intelligence technologies. People try to draw a line between what humans do and what computers do, as if they were completely separate. They are not. People decide what computers do, and people use computers in ways that change how they themselves behave. That’s one reason I wanted to write about Caolan. He takes us behind the curtain to see the forces, both human nature and engineered design, that influence what we do and how we think.
What should we do about it?
I think the most important thing is to think critically about what we are really seeing and doing online. What scares me are new technologies, including deepfakes, that will be able to generate fake, misleading or outrageous material on a far larger scale than people like Caolan ever could. It will soon be harder to know what’s real and what isn’t.
Isn’t it also dangerous if we learn to distrust what we see?
Yes. Some people in technology believe that the real risk of deepfakes is that people will learn not to believe anything, including what is real.
How does Mr. Robertson feel about having made YouTube videos that he believes polarized and misled people?
In a way, he regrets what he’s done, or at least wants to distance himself from it. But he’s now essentially using the tactics he once used to create far-right videos to create far-left ones. He’s doing on one political side exactly what he did on the other.