Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book, now revised in full color, shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library. Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve:
- Build, debug, and optimize transformer models for core NLP tasks such as text classification, named entity recognition, and question answering
- Learn how transformers can be used for cross-lingual transfer learning
- Apply transformers in real-world scenarios where labeled data is scarce
- Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
- Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
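To give a flavor of the hands-on approach, here is a minimal sketch of the kind of workflow the book teaches, using the Hugging Face Transformers pipeline API for text classification. The example sentence is illustrative, and letting the library pick its default checkpoint is an assumption of this sketch, not something the book prescribes:

    # A minimal sketch: text classification with the Hugging Face
    # Transformers pipeline API. Assumes `pip install transformers`
    # plus a backend such as PyTorch is available.
    from transformers import pipeline

    # Load a ready-made text-classification pipeline; with no model
    # specified, the library downloads its default sentiment checkpoint.
    classifier = pipeline("text-classification")

    # Run inference on a sample sentence; the result is a list of
    # dicts, each with a predicted label and a confidence score.
    result = classifier("Transformers have quickly become the dominant NLP architecture.")
    print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

The same pipeline interface covers other tasks named above, such as named entity recognition and question answering, by passing a different task string.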