Mathematical Foundations of Large Language Models



  • Binding: Hardcover
  • Language: ENG
  • Product code: 9789819204656

Full Description

This book emerged from a simple observation: while Large Language Models (LLMs) have become ubiquitous, their mathematical foundations remain opaque to many practitioners, students, and enthusiasts. This book bridges that gap by presenting LLMs from first principles: not as a black box, but as an elegant mathematical construction. The journey begins with the simplest language models, bigrams and n-grams, and progressively builds toward the transformer architecture that powers modern LLMs. The emphasis is on clarity over completeness, intuition over implementation details, and mathematical rigor over hand-waving explanations. This book is self-contained and assumes only familiarity with undergraduate-level linear algebra, probability, and calculus. Wherever possible, abstract concepts are connected to concrete examples, often using minimal two-word vocabularies to illuminate general principles. The hope is that this primer serves as both an introduction for newcomers and a reference for practitioners seeking a deeper understanding of the mathematical machinery underlying today's most influential AI systems.
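The bigram models and two-word vocabularies the description mentions can be sketched in a few lines. The corpus and vocabulary ("hot", "cold") below are illustrative assumptions, not examples taken from the book; a bigram model simply estimates P(w2 | w1) from adjacent-pair counts.

```python
# A minimal bigram language model on a hypothetical two-word vocabulary.
# The corpus here is an invented toy sequence for illustration only.
from collections import Counter

corpus = ["hot", "cold", "hot", "hot", "cold", "hot"]
vocab = ["hot", "cold"]

# Count each bigram (w1 followed by w2) and each context word w1.
bigrams = Counter(zip(corpus, corpus[1:]))
contexts = Counter(corpus[:-1])

def bigram_prob(w1, w2):
    """Maximum-likelihood estimate P(w2 | w1) = count(w1, w2) / count(w1)."""
    return bigrams[(w1, w2)] / contexts[w1]

for w1 in vocab:
    for w2 in vocab:
        print(f"P({w2} | {w1}) = {bigram_prob(w1, w2):.2f}")
```

For each context word, the conditional probabilities sum to one, which is the defining property of a (first-order Markov) language model's next-word distribution.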

Contents

Preamble.- Foregrounding Large Language Models.- Methods pre-Dating LLM.- Tokens and the Sentence Matrix.- Transformer and the Attention Matrix.- The Multi-Head Attention Matrix.- Computational Complexity of Attention.- Conditional Probability in LLM.- Softmax.- Training Large Language Models.
