BERT should improve Google search significantly 🧭

Last Friday, Google announced BERT, a newly developed artificial intelligence model designed to improve search results. It helps Search understand the context of a query and give users better responses.

Instead of treating queries as a “bag of words”, as Search VP Pandu Nayak put it, BERT (Bidirectional Encoder Representations from Transformers) considers how each word in a sentence relates to the others. Earlier search technologies discarded word order and simply guessed which words were most important. BERT will first roll out in English for US users, with other countries and languages to follow.

BERT is not perfect yet; Google still has work to do when it comes to understanding what we want when we search. As senior VP of AI Jeff Dean explained, BERT teaches itself about language by playing a game: Google engineers trained the model by feeding it paragraphs in which 10% to 15% of the words were randomly deleted, and BERT had to guess which words needed to be filled in. That practice matters, because the search engine processes billions of queries a day, and about 15% of them are queries it has never seen before.
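The fill-in-the-blank game Dean describes can be sketched in a few lines. This is a toy illustration, not Google's implementation: real BERT masks subword tokens and applies extra replacement rules, but the core idea of hiding a fraction of the words and asking the model to recover them looks roughly like this (the function name and mask rate are illustrative assumptions):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=0):
    """Hide roughly `mask_rate` of the tokens, BERT-style.

    Toy sketch of masked-language-model training data: the model would
    have to predict the original token at each masked position.
    (Real BERT operates on subword pieces, not whole words.)
    """
    rng = random.Random(seed)
    masked = list(tokens)
    targets = {}  # position -> original token the model must guess
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = tok
            masked[i] = "[MASK]"
    return masked, targets

sentence = "bert considers how each word in a sentence relates to the others".split()
masked, targets = mask_tokens(sentence)
```

During training, the model sees only `masked` and is scored on how well it recovers the tokens stored in `targets`; repeating this over huge amounts of text is the "game" that teaches it about language.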

So there are still plenty of opportunities to practice and learn…