Phoenix Web Design – Say Hello to BERT!
Yes, you heard that right. Google has a new addition to its list of algorithm updates. BERT was recently added to give Google a better understanding of natural human language. It is Google's answer to the conversational searches done online, and it is expected to affect roughly 10% of searches.
So, what is BERT?
By now, you are surely curious about BERT. Basically, BERT is an acronym for Bidirectional Encoder Representations from Transformers. BERT started out as a research paper, hosted on Cornell University's arXiv, titled BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Jacob Devlin, Ming-Wei Chang, Kenton Lee, et al. were its main researchers.
It is said that BERT achieved state-of-the-art results on 11 natural language processing tasks. Surprisingly, there are a lot of mentions of BERT, and not all of them are about Google's BERT; other academic papers have been written about it aside from the one mentioned above. With BERT in action, natural language understanding (NLU) has improved. Google has also released BERT as open source, because language is dynamic: understanding natural language won't be an easy task, and it will forever be an ongoing process.
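Because the model is open source, anyone can try it out. Here is a minimal sketch, assuming the community-built Hugging Face transformers library and its bert-base-uncased checkpoint (neither comes from Google's announcement itself), that asks a pretrained BERT to fill in a hidden word:

```python
# A minimal sketch of BERT's masked-word prediction, using the
# open-source Hugging Face `transformers` library (an assumption;
# any BERT implementation would behave similarly).
from transformers import pipeline

# Load a pretrained BERT checkpoint behind a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the whole sentence at once, so it guesses the hidden
# word from the context on both sides of the blank.
for prediction in fill_mask("I went to the [MASK] to buy some bread."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The point of the example is the "bidirectional" part of the acronym: BERT uses the words on both sides of the blank, not just the ones that come before it.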
How will BERT affect the world of online search?
As early as now, the machine learning community has expressed a positive reaction to BERT. In fact, they are delighted that this much effort has gone into understanding and enhancing natural language. So far, BERT has been pre-trained on an enormous amount of text, including the 2,500M words of English Wikipedia.
Other companies are also rooting for BERT. Everyone has their eyes on this newest advancement… even Microsoft has started its own MS MARCO. MS MARCO: A Human Generated MAchine Reading COmprehension Dataset is open source and was created to provide natural language answers. Moreover, Microsoft is working on enhancing the process of mapping natural text. It recently released MT-DNN, a Multi-Task Deep Neural Network for learning universal language representations.
Of course, Facebook has its own take as well, in the form of RoBERTa. The SuperGLUE benchmark is another addition, designed to measure performance on harder language tasks.
BERT Aims to Help with These Natural Language Tasks:
- Knowing when words are paraphrased or reworded,
- Providing answers to questions,
- Determining named entities,
- Recognizing the relationship between text fragments,
- Predicting the next sentence (see the sketch right after this list),
- Lastly, resolving translation ambiguity.
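To make the "predicts the next sentence" item concrete, here is a minimal sketch, again assuming the Hugging Face transformers library and PyTorch (the model name and the two sentences are purely illustrative), that asks a pretrained BERT whether one sentence plausibly follows another:

```python
# A minimal next-sentence-prediction sketch with pretrained BERT,
# assuming the Hugging Face `transformers` library plus PyTorch.
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

first = "The storm knocked out power across the city."
candidate = "Crews worked overnight to restore electricity."

# Encode the sentence pair the way BERT was pre-trained to see it.
inputs = tokenizer(first, candidate, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Index 0 = "candidate follows first", index 1 = "it does not".
probs = torch.softmax(logits, dim=1)
print(f"P(follows) = {probs[0][0].item():.3f}")
```

This pair-of-sentences training objective is part of why BERT handles conversational, multi-part queries better than keyword matching does.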
Can You Prepare for BERT? Should You Optimize Your Website?
Short answer: YOU CANNOT!
You see, Google added BERT to gain a better understanding of natural language. There are no rules on what you should or should not do.
BERT is here to understand what is currently published online. For example, Google BERT might decide that a page is relevant because it now has a better understanding of its words. The same is true if it realizes that a page is over-optimized. Remember, previous updates will still be in effect, such as Panda (content), Penguin (links and anchor texts), and many others. So, just be natural. Write as naturally as possible and you will be on BERT's good side for sure!