It’s not just a social media problem – how search engines spread misinformation



Search engines are one of society’s main gateways to information and people, but they are also conduits of misinformation. Like problematic social media algorithms, search engines learn to serve you what you and others have clicked on before. Because people are drawn to the sensational, this dance between algorithms and human nature can foster the spread of misinformation.

Search engine companies, like most online services, make money not only by selling ads but also by tracking users and selling their data via real-time auctions. People are often led to misinformation by their appetite for sensational and entertaining content, and for information that is controversial or confirms their views. For example, one study found that the most popular YouTube videos about diabetes are less likely to contain medically valid information than less popular videos on the topic.

Advertising-driven search engines, like social media platforms, are designed to reward clicks on engaging links because those clicks boost the companies’ business metrics. As a researcher who studies search and recommendation systems, my colleagues and I show that this dangerous combination of corporate profit motive and individual susceptibility makes the problem difficult to fix.

How search results go wrong

When you click on a search result, the search algorithm learns that the link you clicked is relevant to your search query. This is called relevance feedback. The feedback helps the search engine give that link more weight for that query in the future. If enough people click on that link, giving it strong relevance feedback, the website starts showing up higher in search results for that query and related queries.
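To make the mechanism concrete, here is a minimal sketch of counter-based relevance feedback in Python. It is illustrative only: the example queries and URLs are invented, and real search engines use far more complex, learned ranking functions rather than raw click counts.

```python
# Minimal sketch of counter-based relevance feedback. Illustrative only:
# real search engines use far more complex, learned ranking functions.
from collections import defaultdict

# weight[(query, url)] accumulates clicks as relevance feedback
weight = defaultdict(float)

def record_click(query: str, url: str) -> None:
    """Treat each click as a vote that `url` is relevant to `query`."""
    weight[(query, url)] += 1.0

def rank(query: str, candidates: list) -> list:
    """Order candidate URLs by accumulated feedback, most-clicked first."""
    return sorted(candidates, key=lambda url: weight[(query, url)], reverse=True)

# Hypothetical clicks: the cat video gets more of them
record_click("piano tuner", "cat-plays-piano.example")
record_click("piano tuner", "cat-plays-piano.example")
record_click("piano tuner", "local-tuning-service.example")

print(rank("piano tuner", ["local-tuning-service.example", "cat-plays-piano.example"]))
# -> ['cat-plays-piano.example', 'local-tuning-service.example']
```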

People are more likely to click on links near the top of the list of search results. This creates a positive feedback loop, illustrated in the sketch below: the higher a website appears, the more clicks it gets, and those clicks in turn move it up or keep it high. Search engine optimization techniques exploit this knowledge to increase websites’ visibility.
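The feedback loop can be seen in a toy simulation. The per-position click rates below are assumed numbers, not measurements; the point is only that whichever site happens to reach the top early accumulates clicks fastest and tends to stay there.

```python
# Toy simulation of the position-bias feedback loop. The per-position
# click rates are assumed numbers, not measurements.
import random

random.seed(42)
scores = {"site_a": 1.0, "site_b": 1.0, "site_c": 1.0}
CLICK_RATE_BY_POSITION = [0.6, 0.25, 0.1]  # hypothetical: top spot draws the most clicks

for _ in range(1000):
    ranking = sorted(scores, key=scores.get, reverse=True)
    for position, site in enumerate(ranking):
        if random.random() < CLICK_RATE_BY_POSITION[position]:
            scores[site] += 1.0  # each click feeds back into the site's score

print(sorted(scores.items(), key=lambda kv: -kv[1]))
# Whichever site reaches the top early tends to stay there.
```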

There are two aspects to this misinformation problem: how a search algorithm is evaluated and how humans respond to titles, headlines and snippets. Search engines, like most online services, are evaluated using a variety of metrics, one of which is user engagement. It is in search engine companies’ interest to give you things that you want to read, watch or simply click on. Therefore, when a search engine or any recommendation system creates a list of items to present, it calculates the likelihood that you will click on each item.
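As an illustration of where that incentive leads, the sketch below ranks two invented results by an assumed click probability rather than by relevance. Both the items and the numbers are hypothetical, but the ordering shows how a sensational, irrelevant result can outrank a relevant one.

```python
# Sketch of engagement-driven ranking with invented items and numbers:
# ordering by predicted click probability can put a sensational but
# irrelevant result above a relevant one.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    relevance: float      # how well the result matches the query
    predicted_ctr: float  # estimated probability the user clicks it

results = [
    Result("Local piano tuning service", relevance=0.9, predicted_ctr=0.08),
    Result("Cat plays piano (hilarious!)", relevance=0.1, predicted_ctr=0.35),
]

# Rank by predicted engagement, as an ad-driven system is incentivized to do
for r in sorted(results, key=lambda r: r.predicted_ctr, reverse=True):
    print(f"{r.predicted_ctr:.2f}  {r.title}")
# The cat video ranks first despite its low relevance.
```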

Traditionally, this was intended to surface the most relevant information. However, the notion of relevance has become blurry, because people use search to find entertaining results as well as genuinely relevant information.

Imagine that you are looking for a piano tuner. If someone shows you a video of a cat playing the piano, would you click on it? Many people would, even though it has nothing to do with piano tuning. The search system takes that as positive relevance feedback and learns that it is OK to show a cat playing the piano when people search for piano tuners.

In fact, in many cases that works even better than showing the relevant results. People love watching funny cat videos, so the search system earns more clicks and user engagement.

This may sound harmless. So what if people get distracted now and then and click on results that aren’t relevant to the search query? The problem is that people are drawn to exciting images and sensational headlines. They tend to click on conspiracy theories and sensational news, not just cats playing the piano, and to click on them more than on real news or relevant information.

Famous but fake spiders

In 2018, searches for “new deadly spider” spiked on Google following a Facebook post claiming that a new deadly spider had killed several people in multiple states. My colleagues and I analyzed the top 100 Google search results for “new deadly spider” during the first week of this trending query.

The first two pages of Google search results for “new deadly spider” in August 2018 (shaded area) related to the original fake story on this topic, not to factual or debunking information.
Chirag Shah, CC BY-ND

It turned out that the story was false, but people searching for it were largely exposed to misinformation tied to the original bogus post. As people kept clicking on and sharing this misinformation, Google kept pushing those pages toward the top of the search results.

This pattern of sensational, unverified stories emerging and people clicking on them continues, with people apparently either indifferent to the truth or believing that if a trusted service like Google Search shows them these stories, the stories must be true. More recently, a debunked report claiming that China let the coronavirus leak from a lab gained traction in search engines because of this vicious cycle.

Spot the misinformation

To test how well people distinguish accurate information from misinformation, we designed a simple online game called “Google or not.” The game displays two sets of results for the same query. The goal is simple: choose the set that is reliable, trustworthy or most relevant.

In tests, about half the time people could not tell the difference between Google search results containing misinformation and those containing only reliable results.
Chirag Shah, CC BY-ND

One of the two sets contains one or two results that are either verified misinformation or a debunked story. We made the game publicly available and publicized it through various social media channels. In total, we collected 2,100 responses from more than 30 countries.

When we analyzed the results, we found that about half the time people mistakenly chose the set with one or two misinformation results as the trustworthy one. Our experiments with hundreds of other users over many iterations yielded similar results. In other words, about half the time people chose results containing conspiracy theories and fake news. As more people choose these inaccurate and misleading results, the search engines learn that this is what people want.

Big Tech regulation and self-regulation issues aside, it’s important for people to understand how these systems work and how they make money. Otherwise, market economies and people’s natural tendency to be drawn to eye-catching links will keep the vicious cycle going.

[Understand new developments in science, health and technology, each week. Subscribe to The Conversation’s science newsletter.]

