Do You Think Online Conspiracy Theories Can Be Dangerous?

Find all our Student Opinion questions here.

Do you believe in any conspiracy theories? Have you seen conspiracy theories featured in videos, memes, podcasts or talk shows? How do you know when to quickly dismiss a theory as unfounded or absurd — and when to consider if there’s any truth behind it?

Jack Nicas writes about one place on the internet where conspiracy theories have racked up millions of views: YouTube. Perhaps you’ve seen some of these videos. In “Can YouTube Quiet Its Conspiracy Theorists?” Mr. Nicas reports on YouTube’s efforts to minimize the spread of conspiracy theories on its site:

For years it has been a highly effective megaphone for conspiracy theorists, and YouTube, owned and run by Google, has admitted as much. In January 2019, YouTube said it would limit the spread of videos “that could misinform users in harmful ways.”

One year later, YouTube recommends conspiracy theories far less than before. But its progress has been uneven, and it continues to advance certain types of fabrications, according to a new study from researchers at the University of California, Berkeley.

YouTube’s efforts to curb conspiracy theories pose a major test of Silicon Valley’s ability to combat misinformation, particularly ahead of this year’s elections. The study, which examined eight million recommendations over 15 months, provides one of the clearest pictures yet of that fight, and the mixed findings show how challenging the issue remains for tech companies like Google, Facebook and Twitter.

The researchers found that YouTube has nearly eradicated some conspiracy theories from its recommendations, including claims that the earth is flat and that the U.S. government carried out the Sept. 11 terrorist attacks, two falsehoods the company identified as targets last year. In June, YouTube said the amount of time people spent watching such videos from its recommendations had dropped by 50 percent.

Yet the Berkeley researchers found that just after YouTube announced that success, its recommendations of conspiracy theories jumped back up and then fluctuated over the next several months.

The data also showed that other falsehoods continued to flourish in YouTube’s recommendations, like claims that aliens created the pyramids, that the government is hiding secret technologies and that climate change is a lie.

The researchers argue those findings suggest that YouTube has decided which types of misinformation it wants to root out and which types it is willing to allow. “It is a technological problem, but it is really at the end of the day also a policy problem,” said Hany Farid, a computer science professor at the University of California, Berkeley, and co-author of the study.

The article continues:

Farshad Shadloo, a YouTube spokesman, said the company’s recommendations aimed to steer people toward authoritative videos that leave them satisfied. He said the company was continually improving the algorithm that generates the recommendations. “Over the past year alone, we’ve launched over 30 different changes to reduce recommendations of borderline content and harmful misinformation, including climate change misinformation and other types of conspiracy videos,” he said. “Thanks to this change, watchtime this type of content gets from recommendations has dropped by over 70 percent in the U.S.”

YouTube’s powerful recommendation algorithm, which pushes its two billion monthly users to videos it thinks they will watch, has fueled the platform’s ascent to become the new TV for many across the world. The company has said its recommendations drive over 70 percent of the more than one billion hours people spend watching videos each day, making the software that picks the recommendations among the world’s most influential algorithms.

YouTube’s success has come with a dark side. Research has shown that the site’s recommendations have systematically amplified divisive, sensationalist and clearly false videos. Other algorithms meant to capture people’s attention in order to show them more ads, like Facebook’s News Feed, have had the same problem.

The stakes are high. YouTube faces an onslaught of misinformation and unsavory content uploaded daily. The F.B.I. recently identified the spread of fringe conspiracy theories as a domestic terror threat.

Students, read the entire article, then tell us:

  • Do you watch a lot of YouTube videos? Have you ever watched videos that discuss conspiracy theories? Do you believe any of these videos? How do you know if something on YouTube, or another social media platform, is rooted in fact or is simply fabricated?

  • How concerned are you about people believing conspiracy theories? Is it something that you, your family or friends take seriously? Do you think conspiracy theories are just fun and interesting, or do you think they can present serious consequences? Are they ever dangerous?

  • Why do you think conspiracy theories have become so popular on sites like YouTube?

  • The article states, “The company has said its recommendations drive over 70 percent of the more than one billion hours people spend watching videos each day, making the software that picks the recommendations among the world’s most influential algorithms.” Have you ever watched a video recommended by YouTube? In your experience, how important are these recommendations in driving traffic to videos? What responsibility does YouTube have, if any, to make sure these recommended videos meet a standard for truth? Or, is it only the responsibility of viewers to verify the accuracy of claims?

Students 13 and older are invited to comment. All comments are moderated by the Learning Network staff, but please keep in mind that once your comment is accepted, it will be made public.