
Google Autocomplete – How it Works

In our continuing series about Google autocomplete, we take a look at when, where and how it works in Search. (We first examined it when it had its previous name in the articles: Making use of Google’s Search Suggestions and then recently re-visited it in: Changes to Search Suggestions in Google Results).

Autocomplete is available almost anywhere you find a Google search box, including the Google home page, the Google app for iOS and Android, the quick search box within Android and the “Omnibox” address bar within Chrome. Just begin typing and the predictions appear. They vary from one searcher to another because the list may include your own related past searches.

If a prediction comes from one of your past searches, on desktop the word “Remove” appears next to it. Click that word to delete the past search. (You can also delete past searches in bulk, by particular dates, or by matching terms, using My Activity in your Google Account.)

For example, typing “Sydney w” brings up predictions such as “Sydney weather”, making it easy to finish entering a search on these topics without typing every letter. Autocomplete is especially useful on mobile devices, where typing on a small screen can be hard. Typically, up to 10 predictions are shown on desktop and up to 5 on mobile.

Google calls these “predictions” rather than “suggestions” for a good reason. Autocomplete is designed to help people complete a search they were already intending to do, not to suggest new types of searches to perform. Predictions are determined by looking at the real searches that happen on Google and showing common and trending ones relevant to the characters entered, as well as the searcher’s location and previous searches.
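At its simplest, that mechanism is a prefix match over a log of real searches, ranked by popularity. The sketch below illustrates the idea with a plain frequency count; the query log and the `predictions` function are illustrative stand-ins, and Google’s real system is far more sophisticated, also weighing trends, location and personal history.

```python
# Minimal sketch: rank logged queries that start with the typed prefix
# by how often they were searched. This is an illustration only, not
# Google's actual algorithm.
from collections import Counter

def predictions(prefix, query_log, limit=10):
    """Return the most common logged queries beginning with the prefix."""
    counts = Counter(q for q in query_log if q.startswith(prefix.lower()))
    return [query for query, _ in counts.most_common(limit)]

log = ["sydney weather", "sydney weather", "sydney webcam", "melbourne weather"]
print(predictions("sydney w", log))  # most frequent matches first
```

Typing more characters narrows the candidate set, which is why predictions get more specific as you type.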

Interestingly, Google has faced legal trouble over the feature. Courts in Japan have ruled against it over autocomplete, it has lost cases in France and Italy, and an Irish hotel has sued Google over predictions. Google does remove some predictions, such as piracy-related and adult terms, but when it comes to reputation management it prefers to let the algorithm do its work.

The following types of predictions are removed:

  • Sexually explicit predictions that are not related to medical, scientific, or sex education topics
  • Hateful predictions against groups and individuals on the basis of race, religion or several other demographics
  • Violent predictions
  • Dangerous and harmful activity in predictions

The guiding principle is that autocomplete should not shock users with unexpected or unwanted predictions. Google’s systems are designed to automatically catch inappropriate predictions and not show them, but some still slip through, and Google strives to remove those quickly. In one case in Tokyo in 2013, a search on a particular man’s name produced suggestions that he had committed criminal acts, and Google was ordered to pay the man $3,100 in defamation damages for the mental anguish the suggestion caused him.

Google’s defence has been that its autocomplete predictions are generated automatically from what people are searching for and the content that already exists on the Internet, so the company maintains a position of neutrality. That’s a fair point: when enough searches and content about a subject lead Google’s algorithm to display a prediction, is it the search engine’s fault for honestly reflecting what people are saying online?

In relation to this, Google states “even if the context behind a prediction is good, even if a prediction is infrequent, it’s still an issue if the prediction is inappropriate. It’s our job to reduce these as much as possible”.

To deal better with inappropriate predictions, Google launched a feedback tool last year and has since been using the data to improve its systems. Its removal policy has also recently been expanded with criteria covering hate and violence.

If an inappropriate prediction is spotted, it can be reported using the “Report inappropriate predictions” link Google launched last year, which appears below the search box on desktop.

You can read more about Google’s autocomplete here.

If you want to know more about how Google’s working to reduce inappropriate predictions and how using autocomplete can help your business, contact us now.