Abstract: One of the most powerful recent advances in natural language processing is transfer learning. Transfer learning refers to a machine learning methodology in which a sophisticated model trained on a generic task is “transferred” to another task. In this workshop, we will introduce deep Transformer models, looking specifically at the BERT architecture. We will then carry out some practical transfer learning, leveraging a pretrained BERT model from TensorFlow Hub to build our own text classifiers. By the end of the workshop, attendees will not only be able to create text classifiers with the power of BERT, but will also have the skills to adapt other pretrained models available on TensorFlow Hub to their own needs, transfer-learning style.
Prerequisites: Modern Python development environment (Python >= 3.6, pip).
Level of technical knowledge: Familiarity with Python and at least a light understanding of neural networks. Some familiarity with TensorFlow and Keras would also be useful, but is not required.
Schedule (GMT) - 9:00 AM Start - 10:30 AM Break - 12:00 PM Close
Want to sponsor? Contact us to find out more.