The Interplay between Information and Estimation Measures (Foundations and Trends® in Signal Processing)

  • This is a print-on-demand (OD/POD) edition. Cancellations cannot be accepted.
  • [Note on delivery delays]
    Owing to the current global situation, foreign and secondhand books ordered from overseas may arrive later than the standard delivery time shown.
    We apologize for the inconvenience and ask for your understanding in advance.
  • The cover image and band shown may differ from the actual item.
  • Web store prices for foreign books differ from the prices at our physical stores.
    The selling price of a foreign book is the Japanese-yen price at the time the order is confirmed;
    subsequent changes to the price of the same title are not reflected.
  • Binding: Paperback / 214 pages
  • Language: English (ENG)
  • Product code: 9781601987488

Full Description

If information theory and estimation theory are thought of as two scientific languages, then their key vocabularies are information measures and estimation measures, respectively. The basic information measures are entropy, mutual information and relative entropy. Among the most important estimation measures are mean square error (MSE) and Fisher information. Playing a paramount role in information theory and estimation theory, those measures are akin to mass, force and velocity in classical mechanics, or energy, entropy and temperature in thermodynamics.
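
For reference, the standard definitions of these measures can be sketched as follows; the notation is illustrative and is not quoted from the monograph. For a random variable X with density p, a jointly distributed observation Y, and a reference density q,

\[
h(X) = -\int p(x)\log p(x)\,dx, \qquad
I(X;Y) = \iint p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}\,dx\,dy, \qquad
D(p\,\|\,q) = \int p(x)\log\frac{p(x)}{q(x)}\,dx,
\]
\[
\mathrm{mmse}(X \mid Y) = \mathbb{E}\big[(X - \mathbb{E}[X \mid Y])^{2}\big], \qquad
J(X) = \mathbb{E}\!\left[\left(\frac{p'(X)}{p(X)}\right)^{2}\right].
\]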

The Interplay Between Information and Estimation Measures is intended as a handbook of known formulas that directly relate information measures and estimation measures. It provides intuition and draws connections between these formulas, highlights some important applications, and motivates further exploration. The main focus is on such formulas in the context of the additive Gaussian noise model, with lesser treatment of others such as the Poisson point process channel.

Also included are a number of new results which are published here for the first time. Proofs of some basic results are provided, whereas many of the more technical proofs already available in the literature are omitted. In 2004, the authors of this monograph found a general differential relationship between the mutual information and the minimum mean-square error (MMSE) in Gaussian channels, commonly referred to as the I-MMSE formula.
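
As a point of reference, the I-MMSE formula for the scalar additive Gaussian noise channel is usually stated as follows (mutual information in nats); this is the standard statement from the literature rather than a quotation from the book. With \( Y_{\mathrm{snr}} = \sqrt{\mathrm{snr}}\,X + N \) and \( N \sim \mathcal{N}(0,1) \) independent of X,

\[
\frac{d}{d\,\mathrm{snr}}\, I\!\left(X;\, \sqrt{\mathrm{snr}}\,X + N\right) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) \;=\; \mathbb{E}\!\left[\big(X - \mathbb{E}[X \mid Y_{\mathrm{snr}}]\big)^{2}\right],
\]

which holds for essentially arbitrary input distributions under mild regularity conditions.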

In this book a new, complete proof of the I-MMSE formula is developed, which includes technical details omitted in the original papers. The book concludes by highlighting the impact of the information-estimation relationships on a variety of information-theoretic problems of current interest and by providing some further perspective on their applications.

Contents

1. Introduction
2. Basic Information and Estimation Measures
3. Properties of the MMSE in Gaussian Noise
4. Mutual Information and MMSE: Basic Relationship
5. Mutual Information and MMSE in Discrete- and Continuous-time Gaussian Channels
6. Entropy, Relative Entropy, Fisher Information, and Mismatched Estimation
7. Applications of I-MMSE
8. Information and Estimation Measures in Poisson Models and Channels
9. Beyond Gaussian and Poisson Models
10. Outlook
Acknowledgements. Appendices. References