“BERT is a system that can be tuned to do practically all tasks in NLP”

  • December 19, 2019

What sets Google’s natural language processing (NLP) model BERT apart from other language models? How can a custom version be implemented, and what is the so-called “ImageNet moment”?

ML Conference speaker Christoph Henkelmann (DIVISIO) answered our questions regarding these topics and shared his opinion on OpenAI’s controversial choice to initially withhold the full-parameter model of GPT-2.


BERT is a system that can be tuned to do practically all tasks in NLP. It’s very versatile but also really powerful.

Christoph Henkelmann holds a degree in Computer Science from the University of Bonn. He is currently working at DIVISIO, an AI company from Cologne, where he is CTO and co-founder. At DIVISIO, he combines practical knowledge from two decades of server and mobile development with proven AI and ML technology. In his spare time he grows cacti, practices the piano, and plays video games.

The post “BERT is a system that can be tuned to do practically all tasks in NLP” appeared first on JAXenter.
