"Bidirectional Encoder Representations from Transformers (BERT) is a technique for NLP (Natural Language Processing) pre-training developed by Google. BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google. Google is leveraging BERT to better understand user searches."
The BERT Encoder block implements the BERT (Bidirectional Encoder Representations from Transformers) network in its base size, as published in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
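To make the "base size" concrete, the sketch below prints the standard BERT-base hyperparameters. It assumes the Hugging Face `transformers` library, which is only used here for illustration; the Encoder block itself may rely on a different implementation.

```python
# Sketch of the BERT-base configuration, using Hugging Face `transformers`
# (an assumption for illustration; not necessarily the block's backend).
from transformers import BertConfig

# The default BertConfig matches BERT-base: 12 encoder layers,
# 768-dimensional hidden states, 12 attention heads (~110M parameters).
config = BertConfig()
print(config.num_hidden_layers)    # 12
print(config.hidden_size)          # 768
print(config.num_attention_heads)  # 12
```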
BERT pushes the state of the art in Natural Language Processing by combining two powerful technologies:
- It is based on a deep Transformer encoder, an architecture that processes long text passages efficiently by using attention.
- It is bidirectional, meaning that it uses the whole text passage, both before and after a word, to understand that word's meaning (see the sketch after this list).
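The following sketch illustrates the bidirectional, context-dependent encoding: the same surface word receives different vectors depending on the words around it. It assumes the pre-trained `bert-base-uncased` checkpoint from the Hugging Face `transformers` library; the sentences and the helper function `word_vector` are hypothetical examples, not part of the block's API.

```python
# Minimal sketch of bidirectional contextual encoding, assuming the
# `bert-base-uncased` checkpoint from Hugging Face `transformers`.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the encoder output for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

# "bank" gets a different vector in each sentence because the encoder
# attends to the context on both sides of it.
v_river = word_vector("the boat drifted toward the river bank", "bank")
v_money = word_vector("she deposited cash at the bank", "bank")
print(torch.cosine_similarity(v_river, v_money, dim=0))  # noticeably below 1.0
```

A static (non-contextual) word embedding would assign both occurrences of "bank" the same vector; the lower cosine similarity here is what bidirectional context adds.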