Mindful

In the past year, viral fake news and filter bubbles have wreaked havoc, fueling real-world phenomena like #Brexit, #Pizzagate, and #AlternativeFacts.

Economists with the World Economic Forum warned about this trend in their 2013 report on risks to the global economy. Viral rumors, they cautioned, could have devastating impacts as they spread across social media.

Now, America is so divided that liberals and conservatives won’t agree on basic facts. One of the few things they will agree on is how much they disdain hearing each other’s point of view. These polarized views lead social media users—liberal and conservative alike—to ignore inconvenient facts and help spread fake news.

To reunite the country, Americans must reestablish a culture of honesty. And in a world where half of Americans get news from social media, that culture will begin online—one post and comment at a time.

Our Social Responsibility on Social Media

Social media may be one of society’s greatest tools, but it’s also one of our great vulnerabilities. Dan Zigmond, a Zen priest and director of analytics for the Facebook news feed, puts it this way: “By breaking down barriers and making communication easier, we also open up the possibility of making miscommunication easier. It means it’s even more important to exercise mindfulness in the way we relate to each other.”

Zigmond recently helped Facebook develop a suite of new tools that allow users to report fake news on the platform. Under the initiative, launched late last year, Facebook sends stories reported by users to third-party fact-checking organizations such as Snopes, ABC News, and the Associated Press. If those organizations identify an article as fake, it will be flagged as “disputed” and linked to an article explaining why, according to Facebook.

Facebook users can still share articles flagged as fake—there’s no guarantee against another Pizzagate—but they’ll receive a notification before sharing the article telling them that “independent fact-checkers have disputed its accuracy.”

Ultimately, when it comes to the impact of fake news and the kind of content that reigns on our social feeds, we’re the ones with our hands on the lever. Social media may help spread disinformation—users may dig in their heels when challenged on a false claim, ignore facts, and reaffirm their own beliefs—but social media can just as easily be used to fight disinformation. Here are five ways you can bring more awareness to your social media experience, harness online tools to fact-check sources and defeat trolls, and foster better conversations and relationships online:

5 Ways to Make the Internet Honest Again

1) Find Your Blindspots

You likely feel that you’re not susceptible to misinformation, which makes sense—misinformation exists because those who believe it don’t think it’s misinformation. Try a few of these tasks to check your biases and blindspots.

Disagree with yourself

We often ignore information that makes us uncomfortable. This is called “information avoidance.” As decision scientist Russell Golman explains in a paper, we tend to undervalue the validity of information that disagrees with our worldview, while overvaluing information that affirms it. As a result, when conservatives and liberals are given the exact same set of objective facts about a polarizing issue (like climate change), each side may become more entrenched in its beliefs. Intelligent, creative, and scientifically literate people are actually worse at accepting challenging information because they’re better able to rationalize previously held beliefs.

Golman explains that we’re quite good at ignoring, forgetting, or never learning inconvenient information. This is where social media echo chambers, political polarization, science denial, and media bias come from. 83% of social media users say that when they see something they disagree with on social media, they ignore it. 39% of users have blocked or hidden friends or posts because they’re political—and liberals block friends more often.

“I would encourage people not to unfriend or unfollow people because they disagree with them,” suggests Zigmond. “We all have beliefs that feel like part of our identity. It’s comforting to be told that those beliefs are right, and it’s discomforting to have those beliefs challenged. It’s very natural. We just need to be aware of that phenomenon and be conscious of how we react. I would encourage people to try to stay connected and keep a dialogue going. Take the next step and read and inform yourself about what’s being shared.”

Reveal your bubble

If you use Chrome, try installing PolitEcho. It will analyze your newsfeed to create a chart of your friends, organized by political affiliation and how often they appear in your news feed. The results are sobering, often revealing that a small cluster of politically homogeneous acquaintances makes up most of what you see on Facebook.

Vivian Mo, who helped develop PolitEcho, notes: “Ironically, we think of ourselves as open-minded and diverse, only to find that we’ve surrounded ourselves with people who have the same political leanings.”

According to researchers at Facebook, the composition of our friend network is the single most important factor in determining what we see in our newsfeeds. About 20% of liberals’ and conservatives’ friends on social media are from the opposite end of the political spectrum, though liberals tend to have fewer friends with contrasting viewpoints than conservatives do. Despite the fact that America has a very diverse and open internet, we choose not to expose ourselves to unsettling viewpoints.

What effect does this have? Take a look at the Wall Street Journal’s project, Blue Feed, Red Feed, which displays simulated Republican and Democratic newsfeeds next to each other. The two newsfeeds are like different realities.

In the past few months, commentators have argued that journalists are disconnected from conservative America, and that this disconnect is why newspapers largely failed to predict the outcome of the 2016 election. Analysts at MIT confirmed that journalists and Republican voters are almost completely segregated on Twitter, which likely contributes to a lack of mutual understanding.

The easiest way to check your bias is to notice what articles you ignore and which friends you choose to block, and then actually read some of the frustrating articles in your newsfeed with an open mind.

Pop your bubble

A “filter bubble” is a sort of online echo chamber, where—thanks to your choice of friends, information sources, and political preferences—you only see information that affirms your worldview. There are lots of great tools to help you get outside of your filter bubble.

Tools to pop your filter bubble:

  • Hi From the Other Side will introduce you to someone who voted for the candidate you voted against in the presidential election. Simply fill out a questionnaire and the organization will pair you with someone with whom you can “engage in civil conversation.” The goal, says the website, is “not to convince, but to understand.”
  • Escape Your Bubble will also help you understand the other side. It’s a Chrome plugin that injects informative articles into your Facebook feed, meant to highlight issues you might not think about.
  • The Echo Chamber Club handpicks articles to challenge liberal viewpoints, and sends them out in an email newsletter.
  • Allsides is a news organization that views all journalists as inherently biased. “In journalism school, they teach you how to report in an unbiased manner, and some journalists do a pretty good job of that,” CEO John Gable told Forbes shortly after the website launched in 2012. “But frankly, we think that’s bullshit. We don’t think it’s possible to be unbiased.” Allsides gathers contrasting viewpoints on the news of the day, spanning the political spectrum.

2) Flag Fake News

Are you under the impression that British Prime Minister David Cameron once did something really gross with a pig? Do you think police used Facebook to target Standing Rock protesters? Do you remember when the Pope endorsed Donald Trump? Did you hear about how the owner of Corona left millions of dollars in his will to residents of the town where he grew up? None of these things happened—though all of them were widely reported on social media.

During the election, the 20 top fake news stories were more popular than major outlets’ top 20 real news stories.

Activists, organizations, and governments create viral fake news to make money and influence public opinion. And because half of the stories shared on Facebook aren’t even read before they’re posted, rumors and lies can spread like wildfire. All it takes is a captivating photo and headline.

The editors of that 2013 World Economic Forum report on the risks of viral false rumors called for greater media literacy and new technology to fight fake news. Four years later, we’re finally starting to respond.

Up your fact-check game

Google and Facebook recently added new features: Google started incorporating a fact-check tag into some of its news pages, and, after repeatedly promoting fake news in its “Trending” sidebar, Facebook announced that the sidebar will no longer be personalized. Instead, it will show the same stories to everyone—based largely on what news outlets are reporting, rather than what users are posting about—which Facebook hopes will make it more resistant to fake news.

While Facebook’s new tools for flagging fake news are starting to show results, and Zigmond hopes these tools make it easier for people to be mindful and informed, they are no guarantee that fake stories aren’t circulating. Here are three quick tips from FactCheck.org for spotting a fake article:

  • Check the author: That story about the Pope endorsing Donald Trump? No byline at all. Looking for a story’s author can be the first piece of the puzzle: What are their qualifications? Have they won any awards for their journalistic work? Can you click through to their biography and other pieces they’ve written? That search could turn up pieces on other bogus sites.
  • Check the date: Sometimes fake news stories aren’t actually fake—people have just shared them years later, claiming they are related to current events. “Today’s worst snowstorm on record” could well be from the previous decade.
  • What’s the support? Fake articles may cite official—or “official-sounding”—sources, but once you investigate the source, you may find it doesn’t actually back up the claim. It’s always a good idea to check links in articles to see if the sources used as supporting evidence actually support the topic.

Install tools that find fake news

We’ve all seen friends and family members post questionable stories. There are lots of tools that can help you debunk them when you see them.

Slate’s new Chrome plugin, This Is Fake, will label fake news stories in your Facebook feed and suggest an article to debunk each one, so that you can add a clarifying comment to the misleading post with a link to the real story. Fake News Alert, another plugin, will notify you when you arrive at a website known to publish false or sensational stories. The Washington Post has created a plugin that fact-checks President Trump’s tweets as they appear in your Twitter timeline, providing clarification and needed context.

If you’re not sure about a story, there are lots of websites that will help you assess its truth. Check the story on reputable fact-checking sites like PolitiFact, Factcheck.org, or Snopes.

Refine your “truthiness” meter

Science writer Emily Willingham created a methodology for analyzing the truthfulness of news stories. She explains how to examine every article, starting with the URL: Is it ABCnews.com or ABCnews.com.co? The second URL takes you to an official-looking fake news website, clearly hoping to catch users unawares. Willingham says it’s also important to consider the bias you bring to an article, as well as the website’s bias. Even stories on prominent sites like the New York Times can point to inaccurate conclusions due to author bias, sensational headlines, or questionable sources.

Finding the truth in a post-truth media landscape, where any source can be biased and misleading, requires skill and skepticism. NPR created a 14-point “Finder’s Guide to Facts” that offers some simple ways to judge the value of a news story. It ends with a powerful reminder: “Learning the truth is not a goal, but a process.”

3) Integrate Rather than Polarize

As journalist Andres Miguel Rondon explains, based on his experience as an anti-populist activist in Venezuela, “Populism can only survive amid polarization. It works through caricature, through the unending vilification of a cartoonish enemy. Pro tip: you’re the enemy… Your organizing principle is simple: don’t feed polarization, disarm it.” Here’s how to put it into action:

Explore beliefs instead of facts

Evidence and data are extremely important, but they aren’t very effective at changing minds. To do that, you need to address a person’s beliefs and motivations. Liberals and conservatives tend to have different moral foundations. Liberals tend to place more weight on values of fairness and caring, while conservatives place more on loyalty, groups, and purity. In debates, we often assume our own moral foundations are universal and identify those who disagree as immoral. Combined with our tendency to avoid information that makes us feel uncomfortable, different moral frameworks can turn into a polarized narrative of us-versus-them.

So, instead of trying to win an argument, try reframing your points to align with your opponent’s framework. For example:

  • If you believe that healthcare should be more widely available, while someone you’re talking to wants to abolish Obamacare, try talking about the need to eradicate infectious diseases, which might align with their moral value of purity.
  • If you believe that military spending is essential for a strong country, but you’re talking to someone who wants to cut military spending, try emphasizing how military spending helps the economically disadvantaged find employment, which might align with their moral value of fairness.

As Whitney Phillips, who wrote a book on internet trolls, explains, instead of asserting that what someone believes is wrong, it’s helpful to ask, “in what ways is this true for you?” In order to have a constructive conversation, it can be important to understand, and sometimes challenge, cultural beliefs.

The Atlantic has a guide on how to use moral frameworks to have more constructive debates.

Discredit the discreditors

Even when we help someone expand their view on something, our work can quickly be undone when their original beliefs are reinforced by new evidence. In one study, researchers showed participants evidence of climate change and saw a sharp increase in the acceptance of climate change. When researchers then showed the participants false information that discredited climate science, the participants’ acceptance dropped all the way back down to the starting point.

But researchers learned that they could also “inoculate” participants against misinformation. Researchers showed other participants evidence supporting climate change, along with a disclaimer warning that politically motivated groups use misinformation to convince the public that scientists disagree about climate change. Then the researchers gave these participants the same misinformation—but, this time, it was far less effective at discrediting the real science.

4) Defeat the Trolls

In 2015, then-Twitter CEO Dick Costolo said, “We suck at dealing with abuse and trolls on the platform and we’ve sucked at it for years.”

Trolls are internet users who deliberately provoke others with offensive posts and comments. Last summer, cohorts of trolls pushed black actor Leslie Jones off Twitter with a barrage of hate speech. Trolls can be individuals, computer programs, or organized groups with an agenda. They are extremely effective at shutting down public discourse and spreading misinformation. Here’s how we shut them down:

Support victims

You can support victims of trolling through Heartmob, a website that supports freedom of speech by fighting abuse and harassment. By signing up, you make yourself available to help victims of harassment by sending supportive messages and reporting abuse. If you’re being harassed, Heartmob can help you develop a safety plan and get support from its members.

Feed the trolls

New research suggests that the popular internet adage “don’t feed the trolls” isn’t completely true. In some situations, you can pacify trolls by engaging with them. Kevin Munger, a researcher at NYU, programmed Twitter bots to respond to white users who posted racial slurs with a pre-programmed message: “Hey man, just remember that there are real people who are hurt when you harass them with that kind of language.”

Sadly, Munger found that when bots appearing to be black men challenged racist tweets, racial slurs from the offending user subsequently increased. However, he also found that when bots that shared a social identity—like race—with the racist challenged them, their use of racial slurs dropped.

“Overall, I found that it is possible to cause people to use less harassing language,” wrote Munger. “If we remember that there’s a real person behind every online encounter and emphasize what we have in common rather than what divides us, we might be able to make the internet a better place.”

Munger’s research contributes a digital perspective to long-standing discussions of allyship—how people of privilege can use their privilege to fight prejudice in solidarity with members of marginalized communities.

5) Connect Mindfully on Social Media

Social media—like mindfulness practice—is a tool for creating human connection, according to Zigmond. “When people feel connected to others, that brings out their best selves,” he says. “When people feel disconnected, their worst impulses come out.”

To that end, Zigmond recommends asking yourself three age-old questions to determine whether something is worth saying or sharing:

  • Is it true?
  • Is it necessary?
  • Is it kind?

The internet can help bring out our best selves, but only if we use it as a forum for authentic communication. Facebook and Google can introduce new tools to encourage such communication, but fundamentally, as Zigmond says, “It’s up to you what you want to see. That comes with responsibility.”

It’s helpful to bring mindfulness to social media in the same way we would to real-world conversation. Pay attention to your own thoughts and beliefs, and try to be constructive when you discuss them with others.

Correction: a previous version of this story stated Dan Zigmond was head of data science at Facebook. In fact, he is director of analytics for the Facebook news feed.

Sam Littlefair

Sam Littlefair is a features writer and the assistant editor of LionsRoar.com.
