Did you say hello to BERT already? With BERT, search engines can now have a better understanding of how humans communicate. BERT, a new addition to Google’s algorithmic updates, helps machines understand natural language better. Of course, there will still be things that machines won’t easily understand.
Since BERT is now in the picture, search engines hope to solve the following problem areas:
Lots and lots of words
- Since Google values content, there is a humongous pile of content out there. That also means words are everywhere. How can search engines understand everything? Well, this is why BERT was created.
- With content in every part of the Internet, words can become ambiguous; the same word can imply a totally different meaning. BERT aims to work out what a piece of content is trying to impart to its readers by understanding sentences and phrases as a whole.
Having multiple meanings
- The English language is pretty complex. Take, for example, the word “love”: it could be a noun or a verb. Search engines need to identify whether a page is talking about the noun or the verb.
- If you think that’s already bad enough, spoken English is worse because of homophones, words that are pronounced alike but differ in meaning. For example, “read” (the past tense of “read”) and “red” (the color) sound the same. One needs to listen to the whole sentence to know which is meant.
- Generally, words may sound different or mean different things depending on the intonation, stress, and tone of the speaker. If you aren’t careful, it is easy to misinterpret what was meant.
- Aside from that, we humans don’t find this a difficult task, since we grow up with the critical thinking needed for it. Machines and robots, however, do not understand the context of a written article or a speech; they have no common sense to help them make sense of a given situation. Yes, this part will be a big challenge for future technologies.
Understanding the meaning behind the words
- As words accumulate, it’s harder to understand the context. As the great American painter Kenneth Noland once said, “For me context is the key – from that comes the understanding of everything.”
- Search engines need to work double time to understand the context of the words. Of course, things should not be taken too literally; one has to learn to read between the lines. A sentence’s meaning can change completely depending on the context.
- For example, take “I don’t like you to talk like that.” The first “like” is a verb, while the second “like” is a preposition meaning “similar to.” Thus, a word’s meaning can change depending on the other words surrounding it.
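To make the point concrete, here is a toy sketch in Python (not BERT itself, and the sense labels and rules are invented for illustration only): a context-free dictionary lookup gives the same answer for every occurrence of “like,” while even a crude rule that peeks at the previous word can tell the two senses apart. Contextual models such as BERT learn far richer versions of this from data.

```python
# Hypothetical, hand-made sense inventory for illustration only.
CONTEXT_FREE = {"like": "enjoy (verb)"}

def context_free_sense(word):
    """Looks up a single fixed sense, ignoring surrounding words."""
    return CONTEXT_FREE.get(word.lower(), "unknown")

def contextual_sense(tokens, index):
    """Picks a sense for 'like' from the neighboring word — a crude
    stand-in for what contextual models learn automatically."""
    if tokens[index].lower() != "like":
        return "unknown"
    prev = tokens[index - 1].lower() if index > 0 else ""
    # After verbs of manner ("talk like", "sound like"), 'like'
    # means 'similar to'; otherwise treat it as the verb 'enjoy'.
    if prev in {"talk", "talks", "talked", "sound", "sounds", "look", "looks"}:
        return "similar to (preposition)"
    return "enjoy (verb)"

sentence = "I don't like you to talk like that".split()

print(context_free_sense("like"))        # enjoy (verb) — same answer everywhere
print(contextual_sense(sentence, 2))     # enjoy (verb)
print(contextual_sense(sentence, 6))     # similar to (preposition)
```

The context-free lookup cannot distinguish the two occurrences at all; only the rule that reads the surrounding words can, which is exactly the gap BERT is meant to close at scale.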
The Search Engines’ Ability to Recognize
Remember, the ability to recognize is NOT the same as understanding.
Natural language recognition (NLR) is not natural language understanding (NLU); understanding goes deeper than recognition because search engines need to grasp the context. Google knows about the problems surrounding natural language understanding, which is why it is now gradually introducing BERT.
Gradually, more updates will be released so that machines can adapt and understand more like humans. Certainly, there are still gaps to fill. So far, Google is making it a collaborative effort by releasing BERT as open source.