A blog on current news and events in machine learning and deep learning

transformer-bert-ulmfit-elmo

Context matters! Most of the advancement in NLP during 2018-2019 has come from adding context to word embeddings, at the cost of enormous compute (exaFLOPs of training), and has roughly tripled effectiveness compared with the word2vec days.
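
To make "adding context" concrete, here is a minimal sketch, assuming the Hugging Face `transformers` and `torch` packages (neither is named in this post): a contextual model such as BERT gives the same word different vectors depending on its sentence, unlike the single static vector word2vec assigns.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Contextual embedding of `word` as it appears inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_dim)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

v_river = word_vector("he sat on the bank of the river", "bank")
v_money = word_vector("she deposited the cash at the bank", "bank")
# The two "bank" vectors differ, so their cosine similarity is well below 1.0.
print(torch.cosine_similarity(v_river, v_money, dim=0).item())
```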

This article summarizes the background well.

Building on the progress through 2019 described above, the May 2020 state of the art in natural language processing (NLP) is the ELECTRA model.
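
For readers who want to try ELECTRA directly, here is a minimal sketch, again assuming the Hugging Face `transformers` package (not mentioned in the post): it loads Google's published small discriminator checkpoint and attaches a classification head for downstream fine-tuning.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/electra-small-discriminator")
model = AutoModelForSequenceClassification.from_pretrained(
    "google/electra-small-discriminator", num_labels=2
)

inputs = tokenizer("Context matters!", return_tensors="pt")
logits = model(**inputs).logits  # head is untrained: fine-tune before relying on it
print(logits.shape)              # torch.Size([1, 2])
```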