Entropy Measures, Maximum Entropy Principle and Emerging Applications (Studies in Fuzziness and Soft Computing Vol.119) (2003. 297 p.)


  • Binding: Hardcover / 297 p.
  • Product code: 9783540002420

Full Description

The last two decades have witnessed an enormous growth in applications of the information theoretic framework in the physical, biological, engineering and even social sciences. In particular, growth has been spectacular in the fields of information technology, soft computing, nonlinear systems and molecular biology. Claude Shannon laid the foundation of information theory in 1948 in the context of communication theory. It is indeed remarkable that his framework is as relevant today as it was when he proposed it. Shannon died on February 24, 2001. Arun Netravali observes: "As if assuming that inexpensive, high-speed processing would come to pass, Shannon figured out the upper limits on communication rates. First in telephone channels, then in optical communications, and now in wireless, Shannon has had the utmost value in defining the engineering limits we face." Shannon introduced the concept of entropy. The notable feature of the entropy framework is that it enables quantification of the uncertainty present in a system. In many realistic situations one is confronted only with partial or incomplete information, in the form of moments or bounds on their values, and it is then required to construct a probabilistic model from this partial information. In such situations, the principle of maximum entropy provides a rational basis for constructing a probabilistic model. It is thus necessary and important to keep track of advances in the applications of the maximum entropy principle to ever expanding areas of knowledge.
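
As a concrete illustration of that last point, the sketch below (not taken from the book) uses SciPy to recover the maximum entropy distribution of a die when the only available information is a single moment constraint, in the spirit of Jaynes' well-known dice example. The target mean of 4.5 and all names in the code are illustrative assumptions.

```python
# Minimal sketch of the maximum entropy principle: given only the constraint
# that a die's mean roll is 4.5 (an assumed example value), find the
# distribution over faces 1..6 that maximizes Shannon entropy
# H(p) = -sum_i p_i log p_i.
import numpy as np
from scipy.optimize import minimize

faces = np.arange(1, 7)      # outcomes 1..6
target_mean = 4.5            # the only information available

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)        # guard against log(0)
    return np.sum(p * np.log(p))      # negative Shannon entropy (nats)

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                 # normalization
    {"type": "eq", "fun": lambda p: np.sum(p * faces) - target_mean}, # moment constraint
]
bounds = [(0.0, 1.0)] * 6
p0 = np.full(6, 1.0 / 6.0)   # start from the uniform distribution

result = minimize(neg_entropy, p0, bounds=bounds, constraints=constraints)
print("maximum entropy distribution:", np.round(result.x, 4))
print("entropy (nats):", -neg_entropy(result.x))
```

The numerical solution agrees with the closed form for this problem: the maximizer is an exponential family distribution, p_i proportional to exp(lambda * i), with the Lagrange multiplier lambda chosen so that the mean constraint holds.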

Contents

1 Uncertainty, Entropy and Maximum Entropy Principle — An Overview.- 1.1 Uncertainty.- 1.2 Measure of Uncertainty in Random Phenomena.- 1.3 Shannon's Entropy.- 1.4 Properties of Shannon's Entropy.- 1.5 Asymptotic Equipartition Property (AEP).- 1.6 Joint and Conditional Entropies, Mutual Information.- 1.7 Kullback-Leibler (KL) Directed Divergence.- 1.8 Entropy of Continuous Distribution: Boltzmann Entropy.- 1.9 Entropy and Applications.- 1.10 Weighted Entropy.- 1.11 Fuzzy Uncertainty.- 1.12 Generalized Measures of Entropy.- 1.13 Maximum Entropy Principle.- 1.14 Entropy and MEP based Applications.- 1.15 Conclusions.- References.
2 Facets of Generalized Uncertainty-based Information.- 2.1 Introduction.- 2.2 Uncertainty Formalization.- 2.3 Uncertainty Measurement.- 2.4 Uncertainty Utilization.- 2.5 Conclusions.- References.
3 Application of the Maximum (Information) Entropy Principle to Stochastic Processes far from Thermal Equilibrium.- 3.1 Introduction.- 3.2 The Fokker-Planck Equation Belonging to the Short-Time Propagator.- 3.3 Correlation Functions as Constraints.- 3.4 Calculation of the Lagrange Multipliers.- 3.5 Practical Feasibility.- 3.6 Concluding Remarks.- References.
4 Maximum Entropy Principle, Information of Non-Random Functions and Complex Fractals.- 4.1 Introduction.- 4.2 MEP and Entropy of Non-Random Functions.- 4.3 Fractional Brownian Motion of Order n.- 4.4 Maximum Entropy Principle and Fractional Brownian Motion.- 4.5 Concluding Remarks.- References.
5 Geometric Ideas in Minimum Cross-Entropy.- 5.1 Introduction.- 5.2 "Pythagorean" Theorem and Projection.- 5.3 Differential Geometry.- 5.4 Hausdorff Dimension.- References.
6 Information-Theoretic Measures for Knowledge Discovery and Data Mining.- 6.1 Introduction.- 6.2 Analysis of Information Tables.- 6.3 A Review of Information-Theoretic Measures.- 6.4 Information-Theoretic Measures of Attribute Importance.- 6.5 Conclusion.- References.
7 A Universal Maximum Entropy Solution for Complex Queueing Systems and Networks.- 7.1 Introduction.- 7.2 The Principle of ME.- 7.3 The GE Distribution.- 7.4 ME Analysis of a Complex G/G/1/N Queue.- 7.5 ME Analysis of Complex Open Queueing Networks.- 7.6 Conclusions and Further Comments.- References.
8 Minimum Mean Deviation from the Steady-State Condition in Queueing Theory.- 8.1 Introduction.- 8.2 Mathematical Formalism.- 8.3 Number of Arrivals.- 8.4 Interarrival Time.- 8.5 Service Time.- 8.6 Computer Program.- 8.7 Conclusion.- References.
9 On the Utility of Different Entropy Measures in Image Thresholding.- 9.1 Introduction.- 9.2 Summarization of Image Information.- 9.3 Measures of Information.- 9.4 Thresholding with Entropy Measures.- 9.5 Implementation and Results.- 9.6 Conclusions.- References.
10 Entropic Thresholding Algorithms and their Optimizations.- 10.1 Introduction.- 10.2 Iterative Method for Minimum Cross Entropy Thresholding.- 10.3 Iterative Maximum Entropy Method.- 10.4 Extension to Multi-level Thresholding.- 10.5 Results and Discussions.- References.
11 Entropy and Complexity of Sequences.- 11.1 Introduction.- 11.2 Representations of Sequences and Surrogates.- 11.3 Entropy-like Measures of Sequence Structure.- 11.4 Results of Entropy Analysis.- 11.5 Grammar Complexity and Information Content.- 11.6 Results of the Grammar Analysis.- 11.7 Conclusions.- References.
12 Some Lessons for Molecular Biology from Information Theory.- 12.1 Precision in Biology.- 12.2 The Address is the Message.- 12.3 Breaking the Rules.- 12.4 Waves in DNA Patterns.- 12.5 On Being Blind.- 12.6 Acknowledgments.- References.
13 Computation of the MinMax Measure.- 13.1 Introduction.- 13.2 Minimum Entropy and the MinMax Measure.- 13.3 An Algorithm for the MinMax Measure.- 13.4 Numerical Example: A Traffic Engineering Problem.- 13.5 Concluding Remarks.- References.
14 On Three Functional Equations Related to the Bose-Einstein Entropy.- 14.1 Introduction.- 14.2 Solution of Equations (14.4) and (14.5).- 14.3 Solution of the Equation (14.6).- References.
15 The Entropy Theory as a Decision Making Tool in Environmental and Water Resources.- 15.1 Introduction.- 15.2 Entropy Theory.- 15.3 Other Representations of Entropy.- 15.4 Entropy as a Decision Making Tool in Environmental and Water Resources.- 15.5 Implications for Developing Countries.- 15.6 Concluding Remarks.- References.
