CBTW invites you to their event

Transfer Learning in NLP – How to benefit from major breakthroughs with Open-Source Transformer Models

About this event

"The key technology breakthrough underlying this revolution in language AI is the Transformer" – Forbes, 2020 (Source: The Next Generation Of Artificial Intelligence)

Large amounts of natural language data – such as PDFs, social media posts, and knowledge bases – are now available for value generation through translation, summarization, or efficient search. And one key architecture has led to a huge jump in quality across these disciplines: the Transformer. You have probably already heard about the GPT-2/GPT-3 models from OpenAI that are making headlines for achieving human-like language generation capabilities. Well, these are based on a Transformer architecture!

The jump in performance delivered by huge models like GPT opens up a whole new range of use cases and improves the quality of existing ones. This of course raises the question of how one can benefit from these models. Luckily, many Transformer models trained on extremely large text corpora are now open-sourced. Better still, there are smaller, easier-to-deploy Transformer models with competitive results that are ideally suited for industry-scale projects.

This talk is NOT about applying huge language models. We FOCUS on a practical approach built on top of more efficient models. We will explain how pretrained Transformers can be leveraged with little or even no labelled (annotated) data. We will also give an intuitive illustration of how transfer learning evolved in NLP and what ultimately makes the Transformer special.

👉 Secure your spot at our next webinar, experience how we successfully implemented Transformer models for our clients, and get answers to your most burning questions about deploying such models and applying best practices.

Key takeaways of this webinar:

💡 Get an intuitive understanding of the main Transformer innovations,

👨‍🎓 Understand the major benefits and limitations of transfer learning with the Transformer architecture,

✏ Learn how to efficiently label data for Transformers for real-world applications,

🛠 Get details on fine-tuning cost and time, and on inference speed with Transformers,

👌 Explore a real-world example: applying Transformers in customer service request handling.

Who should participate:

  • Data Scientist,
  • Head of Data Science/AI,
  • Data Science/AI Manager,
  • Computational Linguist,
  • Or anyone with a stake in data science and AI, especially in NLP within any type of organization.

Can't attend live? Register now to receive the recording and the slide deck within 24 hours after the live session.

Hosted by

  • Guest speaker
    Michael Brunzel, Data Scientist @ Positive Thinking Company

  • Guest speaker
    Christoph Hiemenz, Data Scientist | Natural Language Processing @ Positive Thinking Company

CBTW

We unite around a shared vision: doing business for a better world.

We create and deliver tech and business solutions. With 3000+ people around the world, we are active in: Strategy & Governance, Product Design & Growth, Software Engineering, Data Analytics & AI, Cloud & Enterprise Platforms, Cyber Security, Banking Technology Solutions, Smart Industrial Solutions.