Proportionate-type Normalized Least Mean Square Algorithms (Focus)

E-book price: ¥22,575
  • E-book edition available

  • Binding: Hardcover / 192 p.
  • Language: ENG
  • Product code: 9781848214705
  • DDC classification: 005

Full Description

The topic of this book is proportionate-type normalized least mean squares (PtNLMS) adaptive filtering algorithms, which estimate an unknown impulse response by assigning each filter coefficient an adaptation gain proportionate to an estimate of that coefficient's magnitude and scaling the update by the current measured error. These algorithms offer low computational complexity and fast convergence for the sparse impulse responses encountered in network and acoustic echo cancellation applications.
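To make the idea concrete, the following is a minimal sketch of a single PNLMS-style update, one of the classical PtNLMS algorithms reviewed in Chapter 1. The function name, parameter defaults, and the specific gain rule (the standard PNLMS rule with the usual safeguards rho and eps) are illustrative assumptions, not the book's exact formulation.

```python
import numpy as np

def pnlms_update(w, x_buf, d, mu=0.5, delta=1e-2, rho=0.01, eps=1e-4):
    """One PNLMS-style iteration (illustrative sketch).

    w     : current estimate of the unknown impulse response (length L)
    x_buf : the L most recent input samples, newest first
    d     : desired (e.g. echo) sample observed at this time step
    """
    # A priori error between the desired signal and the filter output.
    e = d - np.dot(w, x_buf)

    # Proportionate gains: each coefficient receives a gain proportional to
    # its current magnitude, floored by rho and eps so that small or inactive
    # coefficients keep adapting.
    l_inf = np.max(np.abs(w))
    gamma = np.maximum(rho * np.maximum(eps, l_inf), np.abs(w))
    g = gamma / np.mean(gamma)          # normalize so the gains average to one

    # Normalized, gain-weighted LMS update.
    w_new = w + mu * e * (g * x_buf) / (np.dot(x_buf, g * x_buf) + delta)
    return w_new, e
```

With w initialized to zeros, the floor makes all gains equal, so the first iterations behave like NLMS; as large coefficients emerge, they receive proportionally larger step sizes, which is what speeds up convergence on sparse impulse responses.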

New PtNLMS algorithms are developed by choosing gains that optimize user-defined criteria, such as mean square error, at all times. PtNLMS algorithms are extended from real-valued signals to complex-valued signals. The computational complexity of the presented algorithms is examined.
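As a hedged sketch of the complex-valued extension mentioned above, the update below applies a gain-weighted, normalized complex LMS step; the gain vector g is assumed to be supplied by whichever gain-allocation rule is in use (for example a PNLMS-style rule applied to |w|), and all names and defaults are illustrative rather than the book's cPtNLMS derivation.

```python
import numpy as np

def complex_ptnlms_update(w, x_buf, d, g, mu=0.5, delta=1e-2):
    """One gain-weighted complex NLMS-style iteration (illustrative sketch).

    w, x_buf : complex filter weights and input buffer (length L)
    d        : complex desired sample
    g        : real, non-negative per-coefficient gains averaging to one
    """
    # A priori error with the complex filter output; vdot conjugates w.
    e = d - np.vdot(w, x_buf)

    # Conjugating the error keeps the step aligned with the complex
    # gradient of |e|^2; the denominator normalizes by the gain-weighted
    # input energy, regularized by delta.
    norm = np.real(np.vdot(x_buf, g * x_buf)) + delta
    w_new = w + mu * np.conj(e) * (g * x_buf) / norm
    return w_new, e
```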

Contents

PREFACE ix

NOTATION xi

ACRONYMS xiii

CHAPTER 1. INTRODUCTION TO PTNLMS ALGORITHMS 1

1.1. Applications motivating PtNLMS algorithms 1

1.2. Historical review of existing PtNLMS algorithms 4

1.3. Unified framework for representing PtNLMS algorithms 6

1.4. Proportionate-type NLMS adaptive filtering algorithms 8

1.4.1. Proportionate-type least mean square algorithm 8

1.4.2. PNLMS algorithm 8

1.4.3. PNLMS++ algorithm 8

1.4.4. IPNLMS algorithm 9

1.4.5. IIPNLMS algorithm 10

1.4.6. IAF-PNLMS algorithm 10

1.4.7. MPNLMS algorithm 11

1.4.8. EPNLMS algorithm 11

1.5. Summary 12

CHAPTER 2. LMS ANALYSIS TECHNIQUES 13

2.1. LMS analysis based on small adaptation step-size 13

2.1.1. Statistical LMS theory: small step-size assumptions 13

2.1.2. LMS analysis using stochastic difference equations with constant coefficients 14

2.2. LMS analysis based on independent input signal assumptions 18

2.2.1. Statistical LMS theory: independent input signal assumptions 18

2.2.2. LMS analysis using stochastic difference equations with stochastic coefficients 19

2.3. Performance of statistical LMS theory 24

2.4. Summary 27

CHAPTER 3. PTNLMS ANALYSIS TECHNIQUES 29

3.1. Transient analysis of PtNLMS algorithm for white input 29

3.1.1. Link between MSWD and MSE 30

3.1.2. Recursive calculation of the MWD and MSWD for PtNLMS algorithms 30

3.2. Steady-state analysis of PtNLMS algorithm: bias and MSWD calculation 33

3.3. Convergence analysis of the simplified PNLMS algorithm 37

3.3.1. Transient theory and results 37

3.3.2. Steady-state theory and results 46

3.4. Convergence analysis of the PNLMS algorithm 47

3.4.1. Transient theory and results 48

3.4.2. Steady-state theory and results 53

3.5. Summary 54

CHAPTER 4. ALGORITHMS DESIGNED BASED ON MINIMIZATION OF USER-DEFINED CRITERIA 57

4.1. PtNLMS algorithms with gain allocation motivated by MSE minimization for white input 57

4.1.1. Optimal gain calculation resulting from MMSE 58

4.1.2. Water-filling algorithm simplifications 62

4.1.3. Implementation of algorithms 63

4.1.4. Simulation results 65

4.2. PtNLMS algorithm obtained by minimization of MSE modeled by exponential functions 68

4.2.1. WD for proportionate-type steepest descent algorithm 69

4.2.2. Water-filling gain allocation for minimization of the MSE modeled by exponential functions 69

4.2.3. Simulation results 73

4.3. PtNLMS algorithm obtained by minimization of the MSWD for colored input 76

4.3.1. Optimal gain algorithm 76

4.3.2. Relationship between minimization of MSE and MSWD 81

4.3.3. Simulation results 82

4.4. Reduced computational complexity suboptimal gain allocation for PtNLMS algorithm with colored input 83

4.4.1. Suboptimal gain allocation algorithms 84

4.4.2. Simulation results 85

4.5. Summary 88

CHAPTER 5. PROBABILITY DENSITY OF WD FOR PTLMS ALGORITHMS 91

5.1. Proportionate-type least mean square algorithms 91

5.1.1. Weight deviation recursion 91

5.2. Derivation of the conditional PDF for the PtLMS algorithm 92

5.2.1. Conditional PDF derivation 92

5.3. Applications using the conditional PDF 100

5.3.1. Methodology for finding the steady-state joint PDF using the conditional PDF 101

5.3.2. Algorithm based on constrained maximization of the conditional PDF 104

5.4. Summary 111

CHAPTER 6. ADAPTIVE STEP-SIZE PTNLMS ALGORITHMS 113

6.1. Adaptation of μ-law for compression of weight estimates using the output square error 113

6.2. AMPNLMS and AEPNLMS simplification 114

6.3. Algorithm performance results 116

6.3.1. Learning curve performance of the ASPNLMS, AMPNLMS and AEPNLMS algorithms for a white input signal 116

6.3.2. Learning curve performance of the ASPNLMS, AMPNLMS and AEPNLMS algorithms for a colored input signal 117

6.3.3. Learning curve performance of the ASPNLMS, AMPNLMS and AEPNLMS algorithms for a voice input signal 117

6.3.4. Parameter effects on algorithms 119

6.4. Summary 124

CHAPTER 7. COMPLEX PTNLMS ALGORITHMS 125

7.1. Complex adaptive filter framework 126

7.2. cPtNLMS and cPtAP algorithm derivation 126

7.2.1. Algorithm simplifications 129

7.2.2. Alternative representations 131

7.2.3. Stability considerations of the cPtNLMS algorithm 131

7.2.4. Calculation of step-size control matrix 132

7.3. Complex water-filling gain allocation algorithm for white input signals: one gain per coefficient case 133

7.3.1. Derivation 133

7.3.2. Implementation 136

7.4. Complex colored water-filling gain allocation algorithm: one gain per coefficient case 136

7.4.1. Problem statement and assumptions 136

7.4.2. Optimal gain allocation resulting from minimization of MSWD 137

7.4.3. Implementation 138

7.5. Simulation results 139

7.5.1. cPtNLMS algorithm simulation results 139

7.5.2. cPtAP algorithm simulation results 141

7.6. Transform domain PtNLMS algorithms 144

7.6.1. Derivation 145

7.6.2. Implementation 146

7.6.3. Simulation results 147

7.7. Summary 151

CHAPTER 8. COMPUTATIONAL COMPLEXITY FOR PTNLMS ALGORITHMS 153

8.1. LMS computational complexity 153

8.2. NLMS computational complexity 154

8.3. PtNLMS computational complexity 154

8.4. Computational complexity for specific PtNLMS algorithms 155

8.5. Summary 157

CONCLUSION 159

APPENDIX 1. CALCULATION OF β_i^(0), β_{i,j}^(1) AND β_i^(2) 161

APPENDIX 2. IMPULSE RESPONSE LEGEND 167

BIBLIOGRAPHY 169

INDEX 173