Here’s how Google search works

This week, President Donald Trump shared a Fox News clip featuring Kevin Cernekee, a former Google engineer who claims he was fired for being conservative. In the clip, Cernekee says that when Trump was elected in 2016, Google executives said they would use their power “to control the flow of information to the public and make sure that Trump loses in 2020.”

The president continued his Twitter tirade, saying that Google is using biased search results to make him lose the 2020 election.

The tweets show a misunderstanding of how Google Search and its algorithms work.

“The statements made by this disgruntled former employee are absolutely false,” a Google spokesperson told Recode. “We go to great lengths to build our products and enforce our policies in ways that don’t take political leanings into account. Distorting results for political purposes would harm our business and go against our mission of providing helpful content to all of our users.”

But Trump’s comments also underscore that Google is in a precarious place. Conservative commentators and politicians have accused the company of building products, including its dominant search engine, with an anti-conservative bias. Liberal politicians, along with some of Google’s own workforce, want the company broken up for what they see as monopolistic practices. And internally, Google is facing its own partisan debate, a microcosm of the one being waged across the US.

But there’s no proof that Google prioritizes left-leaning news.

“I don’t want to say it’s impossible, but I don’t know of any studies that have actually looked at it and found search result manipulation [by Google],” Jim Jansen, a computer science professor at Pennsylvania State University, told Recode. “Google’s position generally has been not to mess with organic search results to avoid this type of criticism,” he said, adding that Google has no incentive to display a bias against conservative news.

Instead, it uses a series of algorithms to surface what it thinks are the most relevant answers from “authoritative sources.”

“I personally find it difficult to believe there would be some type of intentional search manipulation widespread at Google,” Jansen said. “It’s just such an algorithmic process, with so many people involved and so many layers of oversight. Just from a project management standpoint it would be quite a challenge.”

But that isn’t to say algorithms never get things wrong or can’t be gamed.

Google did play an unwitting role in the 2016 election of Donald Trump, when its search results were used to spread Russian propaganda and fake news. One way Google is trying to correct these mistakes is by deprioritizing information in search results that is clearly not meant to be helpful, such as intentional manipulation by Russian operatives.

For conservatives worried about biased Google searches, Cernekee’s claims validate their fear that one of the most powerful companies in the world is putting its thumb on the election scale.

How Google Search works

Google uses software to constantly “crawl” websites for new information being shared on the web. It then indexes that information and stores it on Google servers, so that the next time you type in “Russian election interference,” Google would know it can surface a new article from Vox on the topic.
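To make that concrete, here is a minimal Python sketch of an inverted index, the basic data structure behind this kind of lookup. The pages and URLs are invented for illustration; a real crawler discovers pages by following links and re-crawls them for updates.

```python
# A minimal sketch of indexing and lookup, not Google's actual pipeline.
# The pages and URLs below are invented for illustration.
from collections import defaultdict

pages = {
    "vox.com/russia-article": "russian election interference explained in detail",
    "example.com/recipes": "how to bake bread with a sourdough starter",
}

# Build an inverted index: each word maps to the set of pages containing it.
inverted_index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        inverted_index[word].add(url)

def lookup(query: str) -> set:
    """Return the pages that contain every word in the query."""
    word_sets = [inverted_index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*word_sets) if word_sets else set()

print(lookup("russian election interference"))
# {'vox.com/russia-article'}
```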

To get you the most relevant information when you type in those words, Google uses a series of algorithms that consider factors like your word choice, site traffic, how user friendly a site is, and expertise of sources, as well as your location, settings, and your own search history.
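As a rough illustration of how several signals might be blended into a single score, here is a hedged sketch in Python. The signal names, values, and weights are assumptions invented for this example, not Google’s actual factors or formula.

```python
# A hedged sketch of blending ranking signals into a single score.
# Signal names, values, and weights are illustrative assumptions only.
def score_page(page: dict, query_words: set) -> float:
    page_words = set(page["text"].lower().split())
    # Fraction of the query's words that appear on the page.
    word_match = len(query_words & page_words) / max(len(query_words), 1)
    return (0.5 * word_match              # how well the text matches the query
            + 0.1 * page["freshness"]     # newer content scores higher
            + 0.1 * page["usability"]     # e.g. how user friendly the site is
            + 0.3 * page["authority"])    # see the link-based sketch below

page = {"text": "russian election interference explained",
        "freshness": 0.9, "usability": 0.8, "authority": 0.7}
print(score_page(page, {"russian", "election", "interference"}))
# about 0.88 for this made-up page
```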

Paramount among these factors is a website’s authoritativeness, which Google determines using a number of signals, including how often other sites link to that site and how authoritative those linking sites are. This creates a sort of self-reinforcing loop: Google considers a site authoritative because other authoritative sites treat it as authoritative.
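To give a feel for how that loop can still produce a stable ranking, here is a simplified, PageRank-style power iteration in Python. The tiny link graph and the damping factor are assumptions for illustration, not Google’s real data or formula.

```python
# A simplified PageRank-style iteration: a site's authority comes from
# the authority of the sites linking to it. The link graph and damping
# factor are made up for illustration.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

def page_rank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    n = len(links)
    rank = {site: 1.0 / n for site in links}  # start everyone equal
    for _ in range(iterations):
        new_rank = {site: (1 - damping) / n for site in links}
        for site, outgoing in links.items():
            for target in outgoing:
                # Each site passes a share of its own rank to the sites it links to.
                new_rank[target] += damping * rank[site] / len(outgoing)
        rank = new_rank
    return rank

print(page_rank(links))
# c.com scores highest: it is linked to by both other sites.
```

Despite the circular-sounding definition, the scores settle down after enough iterations, which is why a self-reinforcing signal can still yield a usable ranking.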

According to a statistical study by the Economist, Google’s search results reward accurate reporting rather than left-wing politics. To determine a publication’s accuracy, the Economist looked at how it was rated by fact-checking websites, how many Pulitzer prizes it had won, and where it fell in a YouGov poll about Americans’ trust in news sources.

The Economist noted, “If fact-checkers and Pulitzer voters are partisan, our model will be too.”

Similarly, if you don’t think widely trusted news sites are accurate, you’re not going to like Google results.

Google is constantly upgrading its search algorithms, but before it widely deploys any changes, it explores how test groups interact with the change and also consults its thousands of “search quality raters,” who A/B test the relevance of the new and old results to see which gives the better answers. The company’s North Star, it says, is returning relevant results from — you guessed it — authoritative sources.
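As a toy illustration of what such a side-by-side evaluation could look like, here is a short Python sketch; the ratings and the launch threshold are invented for this example and are not Google’s actual process.

```python
# A toy side-by-side evaluation: raters score the relevance of results
# for the same queries under the current and the proposed algorithm.
# The scores (1-5) and the launch threshold are invented for illustration.
from statistics import mean

ratings_current = [3, 4, 3, 2, 4, 3]
ratings_proposed = [4, 4, 5, 3, 4, 4]

improvement = mean(ratings_proposed) - mean(ratings_current)
print(f"average improvement: {improvement:.2f}")

# Only roll out the change if raters judge it a clear improvement.
if improvement > 0.2:  # arbitrary threshold for the sketch
    print("launch the change")
else:
    print("keep the current algorithm")
```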

Google raters don’t decide what qualifies as an authoritative source, but they do say whether the information from that source was the information they were looking for.

Nowhere in the guidelines or in its algorithms, Google says, does it assess a site’s political ideology.

Google’s algorithms are far from perfect, and there have been numerous instances where its search results have been unauthoritative or have been gamed.

Google’s answer boxes, which appear directly at the top of search results and can seem like the “one true answer,” have been plagued with mishaps, like surfacing false claims that former presidents were members of the KKK or that President Obama was instituting martial law.

After the 2016 election, searching “Did the Holocaust happen?” brought up an article from Stormfront, a white nationalist site that specializes in propaganda, titled “Top 10 reasons why the Holocaust didn’t happen.”

While the article certainly fit the bill when it came to matching the words in the query, it was not from a trustworthy source; it was from a source purposefully distributing hateful disinformation. In response, Google updated its algorithm to favor more authoritative sources, especially in cases like these that are susceptible to misinformation.

Google didn’t remove the page from its index, and you can still find it with specific searches, but it no longer comes up as an answer to an open question. (The first result now is from the US Holocaust Memorial Museum, under the title “Holocaust Denial and Distortion.”)

And Google’s algorithms can display bias — but not against the groups and ideologies that some conservatives claim are its targets. Safiya Noble, a USC professor and author of the book Algorithms of Oppression, argues that negative biases against women of color are embedded within algorithms, since, as she told the New Yorker, marginalized groups are “least likely to have the resources to purchase keywords and least likely to technically optimize content in their own interests.”

Siva Vaidhyanathan, director of the University of Virginia Center for Media and Citizenship and author of the book The Googlization of Everything, told Recode that people who believe Google is biased against conservatives are missing the whole point. “Google can be biased, but it’s biased toward high engagement and interest, and interest is reflected both by locality and personal history,” he said.

“They’re taking an extremely narcissistic view of Google,” he added. “They want to believe that they are so important and their issues are so important in a timeless manner that there are people carefully manipulating their search results. That’s just not how Google works.”


Article source: https://www.vox.com/recode/2019/8/7/20756726/trump-google-biased-conservatives-how-search-works-explained
