Google tweaks search results to squash fake news

Google will also update its "Search Quality Rater Guidelines", which are used by the human evaluators of Google's search engine results.

The first step in Google's plan is a change made last month to its Search Quality Rater guidelines - a list of red flags used by Google's human evaluation team. New feedback menus will allow users to notify Google if they feel that anything appearing in the search bar via Autocomplete is inappropriate. The perception that its results can surface misleading content is a problem for Google.

This giant focus group, which tests out changes to the search algorithm, has been told to pay more attention to the sources of any pages rated highly in results, looking around the web to see whether they seem authoritative and trustworthy. Just as editors at traditional media outlets have to curate content and separate fact from fiction, Google has to do the same on a massive scale for everything published to the web.

Gomes explained that Google has long worked to prevent attempts to game its news search rankings and to keep search results as authoritative and accurate as possible. This way, "issues similar to the Holocaust denial results that we saw back in December are less likely to appear".

Feedback for Featured Snippets lets users flag possible fake news. In that environment, Google's challenge is to guard against abuse of the new feedback buttons. The company has also lost lawsuits in Japan and Germany over its search suggestions. "We don't expect the problem will completely disappear". The problem isn't so much that users are searching for fake news; it's that they're being force-fed a steady diet of propaganda from friends, contacts, and fake-news sites they've already bought into.

Google's Autocomplete and Featured Snippets have both caused controversy recently for surfacing content that is not only inaccurate but has also upset people.

The firm says it uses testers from all segments of its user base to avoid making political or biased decisions when determining ratings.

In recent months users have posted videos of a "smart device" responding with answers drawn from racist or conspiratorial sites, such as false claims that former President Obama was plotting a coup, or allegations that four US presidents had been members of the Ku Klux Klan (there is little evidence to suggest any US president was an active or former KKK member).

"Search can always be improved", Google Search's engineering vice president Ben Gomes said. "This can sometimes lead to results that are unexpected".