BERT: one NLP model to rule them all (Practical AI #22)

Published: Nov. 27, 2018, 4:11 p.m.

***Fully Connected** – a series where Chris and Daniel keep you up to date with everything that’s happening in the AI community.*

This week we discuss BERT, a new method from Google for pre-training language representations for natural language processing (NLP) tasks. Then we tackle Facebook's Horizon, the first open source reinforcement learning platform for large-scale products and services. We also address synthetic data, and suggest a few learning resources.