Full Description
This book bridges two seemingly distinct worlds—network theory and machine learning—to reveal the universal laws of scalability that underlie both. It examines how value, capacity, and performance evolve as systems expand, offering a unified framework that connects Metcalfe's Law with neural scaling laws.
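For orientation, the two laws this framework connects take the following standard textbook forms (quoted here as the conventional statements; the book's own formulations may differ in detail):

```latex
% Metcalfe's Law: the value V of a network with n nodes grows
% quadratically, driven by the roughly n^2 possible pairwise links.
V(n) \propto n^2

% Empirical neural scaling law: test loss L falls as a power law
% in parameter count N, with an empirically fitted exponent \alpha > 0.
L(N) \propto N^{-\alpha}
```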
By comparing network growth with model scaling, the book uncovers striking parallels: the diminishing per-node throughput of densely connected networks mirrors the saturating generalization gains of ever-larger AI models. Through rigorous analytical models, it explains when performance scales sublinearly, linearly, or even superlinearly, and why these transitions matter for the future of communication infrastructure and intelligent computation.
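As a concrete instance of sublinear scaling, the classic Gupta–Kumar capacity result for random wireless ad hoc networks (a standard result in this literature, given here as an illustration rather than as the book's own analysis) shows per-node throughput shrinking as nodes are added:

```latex
% Gupta–Kumar (2000): with n nodes sharing channel bandwidth W bits/s,
% the achievable per-node throughput is
\lambda(n) = \Theta\!\left(\frac{W}{\sqrt{n \log n}}\right)

% so aggregate throughput grows only sublinearly in n:
n\,\lambda(n) = \Theta\!\left(W\sqrt{\frac{n}{\log n}}\right)
```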
Designed for researchers and advanced practitioners in computer networks, information theory, and artificial intelligence, this work delivers both conceptual insight and practical guidance. It helps readers recognize the structural forces that shape scalability, the mathematical trade-offs between capacity and efficiency, and the design principles that can transfer between large-scale networks and learning systems.
Readers with backgrounds in probability, linear algebra, and algorithmic modeling will find this book a compelling synthesis of theory and application—a guide to understanding how scaling behavior defines the limits and possibilities of modern computational systems.
Contents
Chapter 1: Introduction and Overview
Chapter 2: Scaling Laws of Self-Organized Communication Networks: Throughput Capacity
Chapter 3: Scaling Laws of Self-Organized Communication Networks: Transport Complexity
Chapter 4: Scaling Laws of Deep-Learning Neural Networks: Taxonomy and Survey
Chapter 5: Scaling Laws of Deep-Learning Neural Networks: Expressive Power
Chapter 6: Scaling Laws of Deep-Learning Neural Networks: Information Loss