Mastering Transformers : The Journey from BERT to Large Language Models and Stable Diffusion (2ND)


  • Stock is held by our overseas book distributor partner. Orders normally ship within 3 weeks.
    Important notes
    1. Delivery delays, or cases where the item becomes unobtainable, may occasionally occur.
    2. Orders of multiple copies may be shipped in separate deliveries.
    3. Requests for copies in mint condition cannot be accommodated.

    ●On the introduction of 3D Secure and payment by credit card
  • 【Note on arrival delays】
    Due to global conditions, foreign books and secondhand foreign books ordered from overseas may arrive later than the standard delivery times shown.
    We apologize for the inconvenience and ask for your understanding in advance.
  • ◆The cover image, obi band, and other details shown may differ from the actual product.
  • ◆Foreign-book prices in the web store differ from the prices at our physical stores.
    Foreign-book prices are fixed in Japanese yen at the time your order is confirmed.
    If the price of the same foreign book changes after your order is confirmed, the change will not be reflected.
  • Binding: Paperback / 462 p.
  • Language: ENG
  • Product code: 9781837633784
  • DDC classification: 006.35

Full Description

Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively

Key Features

Understand the complexity of deep learning architectures and the transformer architecture
Create solutions to industrial natural language processing (NLP) and computer vision (CV) problems
Explore challenges in the preparation process, such as problem and language-specific dataset transformation
Purchase of the print or Kindle book includes a free PDF eBook

Book Description

Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP studies and become a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have been able to outperform traditional machine learning-based approaches on many challenging natural language understanding (NLU) problems.
Aside from NLP, multimodal learning and generative AI have recently emerged as fast-growing areas with promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image. Computer vision solutions based on transformers are also explained in the book. You'll get started by understanding various transformer models before learning how to train different autoregressive language models, such as GPT and XLNet. The book will also get you up to speed with boosting model performance, as well as tracking model training using the TensorBoard toolkit. In the later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time-series data and make predictions.
By the end of this transformers book, you'll have an understanding of transformer models and how to use them to solve challenges in NLP and CV.

What you will learn

Focus on solving simple-to-complex NLP problems with Python
Discover how to solve classification/regression problems with traditional NLP approaches
Train a language model and explore how to fine-tune models for downstream tasks
Understand how to use transformers for generative AI and computer vision tasks
Build transformer-based NLP apps with the Python transformers library
Focus on language generation such as machine translation and conversational AI in any language
Speed up transformer model inference to reduce latency
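As a taste of the ground the book covers, the scaled dot-product attention operation at the heart of every transformer model can be sketched in a few lines of NumPy. This is an illustrative sketch by the reviewer, not code from the book; the function name and dimensions are hypothetical.

```python
# Minimal sketch of scaled dot-product attention:
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# Three 4-dimensional token vectors attending to each other (self-attention).
x = np.random.default_rng(0).normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the value rows, with weights determined by how similar the corresponding query is to each key; stacking this operation with learned projections is what the book's transformer chapters build on.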

Who this book is for

This book is for deep learning researchers, hands-on practitioners, and ML/NLP researchers. Educators, as well as students with a good command of programming, knowledge of machine learning and artificial intelligence, and an interest in developing apps for NLP and multimodal tasks, will also benefit from this book's hands-on approach. Knowledge of Python (or another programming language), familiarity with the machine learning literature, and a basic understanding of computer science are required.

Table of Contents

From Bag-of-Words to the Transformer
A Hands-On Introduction to the Subject
Autoencoding Language Models
Autoregressive Language Models
Fine-Tuning Language Model for Text Classification
Fine-Tuning Language Models for Token Classification
Text Representation
Boosting Your Model Performance
Parameter Efficient Fine-Tuning
Zero-Shot and Few-Shot Learning in NLP
Explainable AI (XAI) for NLP
Working with Efficient Transformers
Cross-Lingual Language Modeling
Serving Transformer Models
Model Tracking and Monitoring
Vision Transformers
Tabular Transformers
Multimodal Transformers
