A Google spokesman said the differences in results were not caused by censorship and that content about the Tiananmen Square massacre is available via Google search in any language or locale setting. Tourist images are highlighted in some cases, the spokesman said, when the search engine detects travel intent, which is more likely for searches made closer to Beijing or typed in Chinese. Searching for Tiananmen Square from Thailand or the U.S. with Google’s results language set to Chinese also returns recent, clean images of the historic site.
“We localize results to your preferred region and language so you can quickly access the most reliable information,” the spokesman said. Google users can change the results they see by updating their location and language settings.
Search Atlas’ creators have also built maps and visualizations showing how search results can vary around the world. One shows how searching for images of “God” yields bearded Christian imagery in Europe and the Americas, images of Buddha in some Asian countries, and the Arabic script for Allah in the Persian Gulf and Northeast Africa. A Google spokesman said the results reflect how its translation service converts the English term “God” into words with more specific meanings in some languages, such as Allah in Arabic.
Other informational borders charted by the researchers do not map directly onto national or linguistic boundaries. Results for “how to combat climate change” tend to divide island nations from countries on continents. In European countries such as Germany, the most common words in Google’s results referred to policy measures such as energy conservation and international agreements; for island nations such as Mauritius and the Philippines, results were more likely to emphasize the enormity and immediacy of the threat, or harms such as sea-level rise.
Search Atlas was presented last month at the academic conference Designing Interactive Systems; its creators are testing a private beta of the service and considering how to broaden access to it.
Search Atlas can’t reveal why different versions of Google portray the world differently. The company’s lucrative ranking systems are closely guarded, and Google says little about how it tunes results based on a person’s geography, language, or activity.
Whatever the exact reason Google shows, or doesn’t show, particular results, they carry a power that is too easily overlooked, says Search Atlas cocreator Ye. “People ask search engines things they would never ask a person, and the things they happen to see in Google’s results can change their lives,” Ye says. “It could be ‘How do I get an abortion?’ or restaurants near you, or how you vote, or get a vaccine.”
WIRED’s own experiments showed how Google can steer people in neighboring countries toward very different information on a fraught topic. When WIRED used Search Atlas to look up the ongoing war in Ethiopia’s Tigray region, the Ethiopian edition of Google pointed to Facebook pages and blogs criticizing Western diplomatic pressure to de-escalate the conflict, suggesting the U.S. and others were trying to weaken Ethiopia. Results for neighboring Kenya, and on the U.S. version of Google, more prominently featured explanatory news coverage from sources such as the BBC and The New York Times.
Ochigame and Ye are not the first to point out that search engines are not neutral actors. Their project was partly inspired by the work of Safiya Noble, cofounder and codirector of UCLA’s Center for Critical Internet Inquiry. Her 2018 book Algorithms of Oppression explored how Google searches using words like “Black” or “Latina” yielded results that reflect and reinforce societal biases against certain marginalized people.
Noble says the project could offer a way to explain the true nature of search engines to a broader audience. “It’s very difficult to make visible the ways in which search engines are not democratic,” she says.