
NLP With Transformers

Natural Language Processing with Transformers is a practical guide by Lewis Tunstall, Leandro von Werra, and Thomas Wolf that teaches readers how to train and scale transformer models using the Hugging Face library. The book covers core NLP tasks, cross-lingual transfer learning, and techniques for efficient model deployment, making it suitable for both beginners and experienced practitioners. It combines theoretical insights with hands-on coding examples to facilitate real-world applications of transformer models.

Revised Edition

Natural Language Processing with Transformers
Building Language Applications with Hugging Face

Lewis Tunstall, Leandro von Werra & Thomas Wolf
Natural Language Processing with Transformers

Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you’re a data scientist or coder, this practical book—now revised in full color—shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You’ll quickly learn a variety of tasks they can help you solve.

• Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
• Learn how transformers can be used for cross-lingual transfer learning
• Apply transformers in real-world scenarios where labeled data is scarce
• Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
• Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments

“The preeminent book for the preeminent transformers library—a model of clarity!”
—Jeremy Howard, Cofounder of fast.ai and professor at University of Queensland

“A wonderfully clear and incisive guide to modern NLP’s most essential library. Recommended!”
—Christopher Manning, Thomas M. Siebel Professor in Machine Learning, Stanford University

Lewis Tunstall is a machine learning engineer at Hugging Face. His current work focuses on developing tools for the NLP community and teaching people to use them effectively.

Leandro von Werra is a machine learning engineer in the open source team at Hugging Face, where he primarily works on code generation models and community outreach.

Thomas Wolf is chief science officer and cofounder of Hugging Face. His team is on a mission to catalyze and democratize AI research.
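The bullet list above mentions using Hugging Face Transformers for core NLP tasks such as text classification. As an illustration only (not an excerpt from the book), the following minimal Python sketch shows the library's pipeline API applied to that task; the default checkpoint that pipeline() downloads and the exact output format are assumptions that can vary with the library version.

# Minimal sketch (not from the book): text classification with the
# Hugging Face Transformers pipeline API.
# Requires `pip install transformers`; the default model checkpoint is
# downloaded on first use.
from transformers import pipeline

# Build a text-classification pipeline. With no model argument, the library
# falls back to a default English sentiment-analysis checkpoint (assumption:
# the exact default may change between library versions).
classifier = pipeline("text-classification")

# Run inference on a single sentence; the result is a list of dicts,
# each with a predicted label and a confidence score.
result = classifier("Transformers are remarkably easy to use!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]

The same pipeline interface covers other tasks from the list above, such as named entity recognition ("ner") and question answering ("question-answering").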

MACHINE LEARNING

US $59.99  CAN $74.99
ISBN: 978-1-098-13679-6

Twitter: @oreillymedia
linkedin.com/company/oreilly-media
youtube.com/oreillymedia
Praise for Natural Language Processing with Transformers

Pretrained transformer language models have taken the NLP world by storm, while
libraries such as Transformers have made them much easier to use. Who better to
teach you how to leverage the latest breakthroughs in NLP than the creators of said
library? Natural Language Processing with Transformers is a tour de force, reflecting the
deep subject matter expertise of its authors in both engineering and research. It is the rare
book that offers both substantial breadth and depth of insight and deftly mixes research
advances with real-world applications in an accessible way. The book gives informed
coverage of the most important methods and applications in current NLP, from
multilingual to efficient models and from question answering to text generation. Each
chapter provides a nuanced overview grounded in rich code examples that highlights best
practices as well as practical considerations and enables you to put research-focused
models to impactful real-world use. Whether you’re new to NLP or a veteran, this book
will improve your understanding and fast-track your development and deployment
of state-of-the-art models.
—Sebastian Ruder, Google DeepMind

Transformers have changed how we do NLP, and Hugging Face has pioneered how we use
transformers in product and research. Lewis Tunstall, Leandro von Werra, and Thomas
Wolf from Hugging Face have written a timely volume providing a convenient and
hands-on introduction to this critical topic. The book offers a solid conceptual grounding
of transformer mechanics, a tour of the transformer menagerie, applications of
transformers, and practical issues in training and bringing transformers to production.
Having read chapters in this book, with the depth of its content and lucid presentation, I
am confident that this will be the number one resource for anyone interested in learning
transformers, particularly for natural language processing.
—Delip Rao, Author of Natural Language Processing and
Deep Learning with PyTorch
