Mastering Transformers

Mastering Transformers: Build state-of-the-art models from scratch with advanced natural language processing techniques

By Savaş Yıldırım and Meysam Asgari-Chenaghlu
Rated 4 out of 5 (9 Ratings)
eBook | Sep 2021 | 374 pages | 1st Edition

eBook: $43.99
Paperback: $54.99
Subscription: Free Trial (renews at $12.99 per month)

What do you get with eBook?

  • Instant access to your digital eBook purchase
  • Download this book in EPUB and PDF formats
  • Access this title in our online reader with advanced features
  • DRM free: read whenever, wherever, and however you want
  • AI Assistant (beta) to help accelerate your learning

Key benefits

  • Explore quick prototyping with up-to-date Python libraries to create effective solutions to industrial problems
  • Solve advanced NLP problems such as named-entity recognition, information extraction, language generation, and conversational AI
  • Monitor your model's performance with the help of BertViz, exBERT, and TensorBoard (see the visualization sketch after this list)
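
To make the visualization benefit above concrete, here is a minimal, hedged sketch of inspecting attention heads with BertViz's head_view. The model name, the example sentence, and the assumption that transformers and bertviz are installed and that the code runs in a Jupyter notebook are illustrative choices, not taken from the book.

```python
# Hedged sketch: visualize BERT attention heads with BertViz (assumes a Jupyter notebook;
# the model and sentence are arbitrary examples).
from transformers import AutoTokenizer, AutoModel
from bertviz import head_view

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("Transformers learn contextual representations.", return_tensors="pt")
outputs = model(**inputs)                      # outputs.attentions: one attention tensor per layer

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
head_view(outputs.attentions, tokens)          # renders an interactive attention view in the notebook
```

The same attention tensors can also be passed to bertviz.model_view for a model-wide overview.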

Description

Transformer-based language models have dominated natural language processing (NLP) research and have become a new paradigm. With this book, you'll learn how to build various transformer-based NLP applications using the Python Transformers library.

The book introduces Transformers by showing you how to write your first hello-world program. You'll then learn how a tokenizer works and how to train your own tokenizer. As you advance, you'll explore the architecture of autoencoding models, such as BERT, and autoregressive models, such as GPT. You'll see how to train and fine-tune models for a variety of natural language understanding (NLU) and natural language generation (NLG) problems, including text classification, token classification, and text representation.

This book also helps you learn efficient models for challenging problems, such as long-context NLP tasks with limited computational capacity. You'll also work with multilingual and cross-lingual problems, optimize models by monitoring their performance, and discover how to deconstruct these models for interpretability and explainability. Finally, you'll be able to deploy your transformer models in a production environment. By the end of this NLP book, you'll have learned how to use Transformers to solve advanced NLP problems with advanced models.
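
As a concrete illustration of the hello-world idea mentioned above, the following minimal sketch uses the Hugging Face transformers pipeline API. The choice of the sentiment-analysis task and the example sentence are assumptions for illustration, not an excerpt from the book.

```python
# Minimal "hello world" with the Hugging Face transformers library (illustrative sketch).
# Assumes `transformers` is installed and the default pipeline model can be downloaded.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers have changed how we approach NLP.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```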

Who is this book for?

This book is for deep learning researchers, hands-on NLP practitioners, as well as ML/NLP educators and students who want to start their journey with Transformers. Beginner-level machine learning knowledge and a good command of Python will help you get the best out of this book.

What you will learn

  • Explore state-of-the-art NLP solutions with the Transformers library
  • Train a language model in any language with any transformer architecture
  • Fine-tune a pre-trained language model to perform several downstream tasks (see the fine-tuning sketch after this list)
  • Select the right framework for the training, evaluation, and production of an end-to-end solution
  • Get hands-on experience in using TensorBoard and Weights & Biases
  • Visualize the internal representation of transformer models for interpretability
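
To sketch what the fine-tuning bullet above can look like in practice, here is a minimal, hedged example of fine-tuning a pre-trained model for text classification with the transformers Trainer API, logging to TensorBoard. The model name, the IMDB dataset, and the hyperparameters are illustrative assumptions, not the book's own setup.

```python
# Hedged sketch: fine-tune a pre-trained model for text classification with the Trainer API.
# Assumes `transformers` and `datasets` are installed; model, dataset, and hyperparameters are examples.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

encoded = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="imdb-distilbert",
    per_device_train_batch_size=16,
    num_train_epochs=1,
    logging_steps=50,
    report_to="tensorboard",   # training curves can then be inspected in TensorBoard
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"].shuffle(seed=42).select(range(2000)),  # small subset keeps the sketch fast
    eval_dataset=encoded["test"].shuffle(seed=42).select(range(500)),
)

trainer.train()
print(trainer.evaluate())  # reports evaluation loss on the held-out subset
```

Pointing TensorBoard at the logging directory (by default under the output directory in recent transformers versions) then shows the logged training curves.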

Product Details

Publication date: Sep 15, 2021
Length: 374 pages
Edition: 1st
Language: English
ISBN-13: 9781801078894

Packt Subscriptions

$12.99 billed monthly
  • Unlimited access to Packt's library of 6,500+ practical books and videos
  • Constantly refreshed with 50+ new titles a month
  • Exclusive early access to books as they're written
  • Solve problems while you work with advanced search and reference features
  • Offline reading on the mobile app
  • Simple pricing, no contract

$129.99 billed annually
  • Unlimited access to Packt's library of 6,500+ practical books and videos
  • Constantly refreshed with 50+ new titles a month
  • Exclusive early access to books as they're written
  • Solve problems while you work with advanced search and reference features
  • Offline reading on the mobile app
  • Choose a DRM-free eBook or video every month to keep
  • Plus, own as many other DRM-free eBooks or videos as you like for just $5 each
  • Exclusive print discounts

$179.99 billed in 18 months
  • The same benefits as the annual plan

Frequently bought together

  • Transformers for Natural Language Processing: $89.99
  • Machine Learning for Time-Series with Python: $54.99
  • Mastering Transformers: $54.99

Total: $199.97

Customer reviews

Rating distribution: 4 out of 5 (9 Ratings)
  • 5 star: 33.3%
  • 4 star: 55.6%
  • 3 star: 0%
  • 2 star: 0%
  • 1 star: 11.1%

Kelvin D. Meeks Sep 15, 2021
Rated 5 out of 5
A few other key words that come to mind to describe this book: foundational, hands-on, practical, crisp, concise, depth & breadth, tremendous value.

With the continued, accelerating explosion in the growth of unstructured data collected by enterprises in texts and documents, the need to be able to analyze and derive meaningful information is more critical than ever, and will be the competitive advantage that distinguishes future winners from losers in the marketplace of solutions. This book is an investment in expanding your awareness of the techniques and capabilities that will help you navigate those challenges.

From the book: "Transformer models have gained immense interest because of their effectiveness in all NLP tasks, from text classification to text generation… [and] effectively improve the performance of multilingual and multi-task NLP problems, as well as monolingual and single tasks."

This book is a practical guide to leveraging (and applying) some of the leading-edge concepts, algorithms, and libraries from the fields of Deep Learning (DL) and Natural Language Processing (NLP) to solve real-world problems, ranging from summarization to question answering. In particular, it serves as a gentle guided tour of some of the important advances that have occurred (and continue to evolve) as the transformer architecture gradually evolved into an attention-based encoder-decoder architecture.

What I particularly liked:
  • The deep subject-matter experience and credentials of the authors ("Savaş Yıldırım graduated from the Istanbul Technical University Department of Computer Engineering and holds a Ph.D. degree in Natural Language Processing (NLP). Currently, he is an associate professor at the Istanbul Bilgi University, Turkey, and is a visiting researcher at the Ryerson University, Canada. He is a proactive lecturer and researcher with more than 20 years of experience teaching courses on machine learning, deep learning, and NLP."; "Meysam Asgari-Chenaghlu is an AI manager at Carbon Consulting and is also a Ph.D. candidate at the University of Tabriz.")
  • The companion "Code in Action" YouTube channel playlist for the book, and the GitHub repository with code examples.
  • The excellent quality, conciseness, and crispness of the writing.
  • The extensive citation of relevant research papers, and the references at the end of chapters.
  • The authors' deep practical knowledge of, and discussions of, the advantages and disadvantages of different approaches.
  • The finely balanced trade-off between technical depth in a given chapter and the need to maintain a steady pace that keeps the reader engaged. Some books go too deep, and some stay too shallow; this book is exceptionally well balanced at just the right depth.
  • The exceptional variety of examples covered.
  • The quality of the illustrations used to convey complex concepts; Figures 1.19, 3.2, 3.3, 7.8, and 9.3 are just a few examples of the many good diagrams.
  • Chapter 1's focus on getting the reader immediately involved in executing a hello-world example with Transformers, its overview of RNNs, FFNNs, LSTMs, and CNNs, and its excellent overview of the developments in NLP over the last 10 years that led to the Transformer architecture.
  • Chapter 2's guidance on installing the required software, the suggestion of Google Colab as an alternative to Anaconda, and its coverage of community-provided models, benchmarks, TensorFlow, PyTorch, and Transformers, including running a simple Transformer from scratch.
  • Chapter 3's coverage of BERT, as well as ALBERT, RoBERTa, and ELECTRA.
  • Chapter 4's coverage of AR models, GPT, BART, and NLG.
  • Chapter 5's coverage of fine-tuning language models for text classification (e.g., for sentiment analysis or multi-class classification).
  • Chapter 6's coverage of NER and POS, which was of particular interest given the effort I had to expend last year doing my own deep dive to prepare recommendations for a client; I wish I had had this book then.
  • Chapter 7's coverage of USE and SBERT, zero-shot learning with BART, and FLAIR.
  • Chapter 8's discussion of efficient sparse transformers (Linformer and BigBird), as well as the techniques of distillation, pruning, and quantization for making efficient models out of trained models. Chapter 8 may well be worth the price of the book by itself.
  • Chapter 9's coverage of multilingual and cross-lingual language model training (and pretraining). I found the discussion of "Cross-lingual similarity tasks" (see p. 278) particularly interesting.
  • Chapter 10's coverage of Locust for load testing, fastAPI, and TensorFlow Extended (TFX), as well as the serving of solutions in environments where a CPU/GPU is available.
  • Chapter 11's coverage of visualization with exBERT and BertViz, as well as the discussion of tracking model training with TensorBoard and W&B.
  • The "Other Books You May Enjoy" section at the end of the book ("Getting Started with Google BERT" and "Mastering spaCy").

Suggestions for the next edition: the fonts used for the text in some figures (e.g., 3.8, 3.10, 3.12, 3.13, 3.14, 4.5, 4.6, 6.2, 6.7, 8.4, 8.6, 9.4, 9.5) appear to be a bit fuzzy in the PDF version of the book; compare those with the clarity of Figure 6.6.
Amazon Verified review
rockstar82 Nov 14, 2021
Rated 5 out of 5
The first part of the book is a long introduction (around 80 pages). The book starts with a review and some code related to the early concepts that led to transformers (e.g., PCA, bag-of-words, embeddings, RNNs, LSTMs) and jumps right into the attention mechanism and the Transformer architecture within a few pages. The architecture is explained through various pictures and visualizations to help readers understand why this model works so well. The references included at the end of various chapters can always provide additional details. While the first chapter is mostly theoretical, the following chapter does contain the gritty details related to programming, from installing the various libraries to the various tricks needed to make certain models work for a particular problem. The authors' habit of adding good images everywhere helps a lot in this endeavour.

The second part of the book is focused on two big issues: a) understanding the architectural differences between autoencoding models (e.g., BERT) and autoregressive models (e.g., GPT); and b) applying such models to classification tasks (e.g., text or token classification). This is the part in which the authors teach you how to fine-tune such models.

The last part of the book is dedicated to advanced topics like efficient transformers, multilingual models, or production-related issues (e.g., serving transformers, visualizing their attention layers or outputs, etc.). This is the part that links directly to advanced research, not unlike the kind of research that was submitted to top conferences in the field a couple of years ago (e.g., up to 2019, there were hardly any Transformer visualizations!).

I particularly liked the ease with which the authors jumped straight into a difficult subject and taught it up to research level. There are very few books able to do this (e.g., I am thinking of the Dive into Deep Learning books or Grokking Deep Learning). It is great to see publishers introducing such books extremely fast (e.g., it took 5 years to have the first D3.js book published, but only 2 years to have a book about BERT or GPT-2/3). Make no mistake, while the title is still about Transformers, the real stars are the newer models discussed here. I consider this a five-star book and I would recommend reading it alongside the various examples of Transformer / BERT / GPT-2 explained source code that are available online. You will thank yourself later!
Amazon Verified review
Colbert Philippe Jun 29, 2022
Rated 5 out of 5
I love this book! It gives me an introduction to, and an understanding of, the very important topic that is Transformers. The book gives a bit of history and the current standard implementation of transformers. It also delves into the advanced topic of attention algorithms. This book is a "must have" for the seasoned AI practitioner working with language model Transformers.
Amazon Verified review
Daniel Eduardo Portugal Revilla Sep 15, 2021
Rated 4 out of 5
Currently, I have more experience as a data engineer, but Machine Learning (ML) and Deep Learning (DL) are no strangers to me. However, I have an attraction to NLP, so I took the opportunity to read Mastering Transformers.

This is a fantastic all-in-one book about Transformers. Transformers are the principal NLP state of the art. Very futuristic. The principal feature of the book is that it gives good step-by-step examples, among which I can mention BERT, the most popular Transformer model, by Google. Another nice feature is that you can learn the origins of NLP, the traditional techniques, and the ML and DL evolution.

You can find information about the principal BERT variants like ALBERT, ELECTRA, and RoBERTa. I missed maybe some examples or information about BioBERT and ClinicalBERT for medical domains and medical apps.

Finally, there is one of my favorite topics about ML: how to deploy and serve your model to production for real or experimental apps.
Amazon Verified review
Sachin Kumar Sep 16, 2021
Rated 4 out of 5
To start with, this book is really comprehensive in its coverage of various transformer models, like autoencoding models (BERT, RoBERTa, etc.) and autoregressive models (GPT, T5, etc.), serving models, etc., with relevant code snippets and use cases to play around with. That said, most of the code examples and text in the book appear to be picked straight from the Hugging Face website, and in some cases adapted for their example datasets. Therefore, this book will be a great resource for beginners or anyone new to transformer models or the Hugging Face website. However, for people who are somewhat familiar with transformer models or with Hugging Face models, I don't think this book is a great resource to spend money on, since it appears to be just a curation of freely available articles or code snippets from the Hugging Face website or other blogs. On a finishing note, for people starting out in NLP or transformers, or for people who want to save themselves the trouble of going through articles on Hugging Face or across the web, I would suggest this book as a great resource. However, people familiar with Hugging Face models, or who have some experience using them, can freely find the information presented in the book.
Amazon Verified review

FAQs

How do I buy and download an eBook?

Where there is an eBook version of a title available, you can buy it from the book details for that title. Add either the standalone eBook or the eBook and print book bundle to your shopping cart. Your eBook will show in your cart as a product on its own. After completing checkout and payment in the normal way, you will receive your receipt on the screen containing a link to a personalised PDF download file. This link will remain active for 30 days. You can download backup copies of the file by logging in to your account at any time.

If you already have Adobe Reader installed, clicking on the link will download and open the PDF file directly. If you don't, save the PDF file to your machine and download the Reader to view it.

Please Note: Packt eBooks are non-returnable and non-refundable.

Packt eBooks and licensing: when you buy an eBook from Packt Publishing, completing your purchase means you accept the terms of our licence agreement. Please read the full text of the agreement. In it we have tried to balance the need for the eBook to be usable for you, the reader, with our need to protect our rights as publishers and the rights of our authors. In summary, the agreement says:

  • You may make copies of your eBook for your own use onto any machine
  • You may not pass copies of the eBook on to anyone else
How can I make a purchase on your website?

If you want to purchase a video course, eBook, or Bundle (Print + eBook), please follow the steps below:

  1. Register on our website using your email address and a password.
  2. Search for the title by name or ISBN using the search option.
  3. Select the title you want to purchase.
  4. Choose the format you wish to purchase the title in; if you order the Print Book, you get a free eBook copy of the same title.
  5. Proceed with the checkout process (payment to be made using Credit Card, Debit Card, or PayPal).
Where can I access support around an eBook?
  • If you experience a problem with using or installing Adobe Reader, then contact Adobe directly.
  • To view the errata for the book, see www.packtpub.com/support and view the pages for the title you have.
  • To view your account details or to download a new copy of the book, go to www.packtpub.com/account
  • To contact us directly if a problem is not resolved, use www.packtpub.com/contact-us
What eBook formats does Packt support?

Our eBooks are currently available in a variety of formats such as PDF and ePub. In the future, this may well change with trends and developments in technology, but please note that our PDFs are not in Adobe eBook Reader format, which has greater restrictions on security.

You will need to use Adobe Reader v9 or later in order to read Packt's PDF eBooks.

What are the benefits of eBooks?
  • You can get the information you need immediately
  • You can easily take them with you on a laptop
  • You can download them an unlimited number of times
  • You can print them out
  • They are copy-paste enabled
  • They are searchable
  • There is no password protection
  • They are lower in price than print
  • They save resources and space
What is an eBook?

Packt eBooks are a complete electronic version of the print edition, available in PDF and ePub formats. Every piece of content down to the page numbering is the same. Because we save the costs of printing and shipping the book to you, we are able to offer eBooks at a lower cost than print editions.

When you have purchased an eBook, simply log in to your account and click on the link in Your Download Area. We recommend saving the file to your hard drive before opening it.

For optimal viewing of our eBooks, we recommend you download and install the free Adobe Reader version 9.