Efforts to Display Truth in an Era of "Fake News"

Matt Rourke / AP Photo, File

Since the election, there’s been lots of talk about fake news – damaging stories made up and spread online. There’s no easy way to counteract these bogus tales, but as Sandy Hausman reports, the University of Virginia is exploring options.

Mudslinging is an American tradition. It’s always been part of our political landscape, but UVA Professor of Media Studies Siva Vaidhyanathan says there’s something new in the 21st century – an online platform where more than 40 percent of Americans get their news.

“When you look at your Facebook page, you see a column right down the middle -- whether it is a video of a puppy, or a story about global warming, or a completely made up story about a candidate for elective office, it’s all presented in the same format and the same tenor," he says. 

And it turns out the things we share, stories and images seen by millions of people, are those that spark emotion.

“You see an exponential increase in attention paid to babies, puppies, funny memes and really destructive nonsense.”

Well-reasoned and documented stories don’t travel nearly as far. And what’s more, Facebook gives people stories that tend to narrow their worldview.

“You start the clicks and the likes and the smiley faces and the comments," Vaidhyanathan says. "Facebook learns all about you and gives you more of what you have engaged with, and so over time you are less likely to encounter the deep, nuanced, well thought-out piece that challenges your point of view. The very core ethic of Facebook is that people should get more of what they like, even if what they like is junk food – even if what they like is poison.”

Facebook is experimenting with a grassroots approach in which users report suspicious content. If enough people do, the questionable stories are reviewed by fact-checkers, who may mark them as unverified.

Vaidhyanathan doesn’t think that will fix the problem, and he says it’s much worse in other parts of the world, like the Philippines and Burma.

“3,000 people killed in the Philippines by (President) Duterte and his thugs, who are all using Facebook to spread their message and distort the truth of what’s going on in the Philippines. Just as many Muslims in Burma have been either displaced or killed by the majority Buddhist population who are using their very first engagements with Facebook – in fact in Burma the words Facebook and Internet are synonyms.”

He believes we will, eventually, find a way to keep Facebook content honest, but these are early days.

“We’re all just babies trying to figure out the norms and expectations and behaviors and laws that should govern our digital information systems.  The problem is we’ve got a bunch of babies driving trucks now," he says. "What was supposed to make us smarter has made us collectively incompetent. We seem less prepared to take on the challenges of our Earth and our society than we were 25 to 30 years ago.  We were supposed to be more united, more human and more humane, because of the fact that we could connect easily with people in Jakarta or Rio de Janeiro.  That hasn’t happened either!”

To promote thoughtful solutions, the University of Virginia will host a conference next month on Facebook and the News, and next year Professor Siva Vaidhyanathan hopes to publish a book on the subject.  He calls it UnSocial Media.

Sandy Hausman is Radio IQ's Charlottesville Bureau Chief.