Googling Hillary Clinton

Hillary Clinton may find herself at the epicenter of another international scandal, one that has recently subsided but has every chance of flaring up again, Sputnik News reports. The Google search engine favors Hillary Clinton in her election campaign, believes a group of international experts led by Harvard professor Robert Epstein.

According to the agency, Google's search suggestions for "Hillary Clinton" are biased in her favor. Google generates positive suggestions for the Democratic candidate, whereas other search engines are said to generate both positive and negative ones.

"Google is burying potential searches for terms that could have hurt Hillary Clinton in the primary elections over the past several months by manipulating recommendations on their site," the video narrated by Matt Lieberman said.


At issue is the so-called autocomplete technology that search engines commonly use today. Google supposedly uses the technology to create a positive image of the Democratic candidate, showing only positive suggestions when a person types "Hillary Clinton" into the search box. For example, when Donald Trump stated that one needs to "investigate Clinton's crimes," Google did not include "Hillary Clinton's crimes" in its autocomplete suggestions.

"Google tries to explain away such findings by saying its search bar is programmed to avoid suggesting searches that portray people in a negative light. As far as we can tell, this claim is false," the researcher says.

"Bing and Yahoo, on the other hand, often show a number of negative suggestions in response to the same search terms. Bing and Yahoo seem to be showing us what people are actually searching for; Google is showing us something else - but what, and for what purpose?" the author of the article wonders.

The article contains a number of screenshots to illustrate what the search engines suggested, but Google introduced changes to its autocomplete algorithm as soon as the scandal erupted.

It is therefore hard to tell what suggestions Google had been offering for "Donald Trump," or even for "Bernie Sanders," Clinton's Democratic rival.

The scandal erupted in the spring, but Google denied the claims and the story went quiet. Yet the researchers led by Robert Epstein continued their work, searching for the names of the presidential candidates from different computers and taking screenshots each time. Their research confirmed that Google does not offer negative suggestions in its autocomplete feature.
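To illustrate how such a comparison could be automated, here is a minimal sketch in Python. It relies on Google's unofficial "suggestqueries" endpoint; that URL and its "client=firefox" parameter are undocumented conventions and are assumptions here, not part of the researchers' own method, which relied on manual searches and screenshots.

# Minimal sketch (not the researchers' actual tooling) for collecting
# autocomplete suggestions programmatically, so results from different
# computers or days can be compared side by side.
# Assumption: Google's unofficial suggest endpoint; it may change or break.
import json
import urllib.parse
import urllib.request

def google_suggestions(query: str) -> list[str]:
    """Return Google's autocomplete suggestions for the given query."""
    url = (
        "https://suggestqueries.google.com/complete/search?client=firefox&q="
        + urllib.parse.quote(query)
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        # The endpoint returns a JSON array: [query, [suggestion, ...], ...]
        data = json.loads(resp.read().decode("utf-8", errors="replace"))
    return data[1]

if __name__ == "__main__":
    for term in ("hillary clinton", "donald trump", "bernie sanders"):
        print(term, "->", google_suggestions(term))

In principle, similar requests could be made against Bing's or Yahoo's suggestion services to reproduce the kind of side-by-side comparison described above, though those endpoints and their response formats would be separate assumptions.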

This is a serious problem indeed, as many US voters who are unsure of their choice tend to trust the Internet. The share of such voters may reach 80 percent in certain social groups.

"The impact of biased search rankings on opinions, which we call the Search Engine Manipulation Effect (SEME), is one of the largest effects ever discovered in the behavioral sciences, and because it is invisible to users, it is especially dangerous as a source of influence," the article says.

The autocomplete feature often guides people as they search the Internet, and it is not an option that a user can disable. The primary function of the autocomplete technology today is to manipulate people, the researchers believe.

"Because Google handles 90 percent of search in most countries and because many elections are very close, we estimate that SEME has been determining the outcomes of upwards of 25 percent of the national elections in the world for several years now, with increasing impact each year," the researchers assume.

Interestingly, it was reported not long ago that the European Commission intended to seek a 3-billion-euro fine from Google for manipulating search results. Such claims were first made against the company in 2010, but it seems that nothing has changed since then, and Google continues to pursue its own, obviously unfair and hardly legitimate, policy.

It is worth noting that researchers from the University of Delaware came to the conclusion that social networks influence users' political preferences. Additionally, scientists from the University of Michigan found that the algorithms of social networks are arranged in a way that makes people conservative and closed to other points of view. It has also been established that Facebook often conducts deliberate experiments on its users.

For example, it was revealed that Facebook administrators would intentionally fill people's "walls" with pessimistic or, conversely, cheerful content. The network would then monitor how this changed people's mood and attitude.

After the legalization of gay marriage in the United States, Facebook conducted another social experiment, examining how quickly the fashion for "rainbow" avatars would spread across the social network. It turned out that the slightest stimulus is enough to set millions of people around the world on the same "wave."


Author's name: Dmitry Sudakov