Google has published new Search Quality Rater Guidelines covering low-quality pages that contain false information, unexpectedly offensive results, hoaxes and unsupported conspiracy theories.
It's not clear what happens when an autocompleted search or snippet is marked as inaccurate or offensive, however. Users will be able to report autocomplete suggestions as "hateful", "sexually explicit", "violent or includes unsafe and harmful activity", or "other". The American company has also updated guidance to its employees who evaluate the quality of results produced by... Featured snippets show up during a search session, where Google tries to offer the most authoritative answer while it is still in the process of finding your answer. Fake news will be down-ranked in search results by tweaking the signals, such as the freshness and frequency of a site's appearance, that are taken into account when ranking a web page. Google's Gomes says the problem now is the "spread of blatantly misleading, low quality, offensive or downright false information".
Google has also reprogrammed a popular feature to omit derogatory suggestions from its automated recommendations of search requests.
Fake news has been a trending term among tech companies in recent months, and Google is taking a more proactive approach toward the issue.
Human evaluators use the guidelines to rate the quality of search results.
In a blog post, Google said the changes should thwart attempts by extremists to abuse its algorithms to promote their content.
The moves follow months of criticism of Google and Facebook Inc. for hosting misleading information, particularly content tied to the 2016 US presidential election.
Google's autocomplete update lets users flag inappropriate suggestions.
Google has done its best to play down the extent of fake news and hateful material - or what it prefers to call "low quality content" - in search results. While those evaluators don't affect search results in real time, they do provide feedback on whether the changes to the algorithms are working, Gomes wrote.
"That feedback is then used to reshape the algorithms - the recipes, if you will - that Google uses". According to the search-engine-optimization service MozCast, about 15 per cent of Google searches now return a result including a featured snippet, which on Google.com just looks like a text box - one of many results - off to the right side.
Only about 0.25 percent of Google's search results were being polluted with falsehoods, Gomes said.
"They simply give feedback about whether the results are good".