Natural Language Processing with Transformers, Revised Edition



  • Binding: Paperback / 406 pp.
  • Language: ENG
  • Product code: 9781098136796
  • DDC classification: 006.35

Full Description

Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book, now revised in full color, shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them into your applications. You'll quickly learn a variety of tasks they can help you solve.

  • Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
  • Learn how transformers can be used for cross-lingual transfer learning
  • Apply transformers in real-world scenarios where labeled data is scarce
  • Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
  • Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments