Table of Contents
Information Theory, Machine Learning, and Reproducing Kernel Hilbert Spaces
Renyi's Entropy, Divergence and Their Nonparametric Estimators
Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria
Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems
Nonlinear Adaptive Filtering with MEE, MCC, and Applications
Classification with EEC, Divergence Measures, and Error Bounds
Clustering with ITL Principles
Self-Organizing ITL Principles for Unsupervised Learning
A Reproducing Kernel Hilbert Space Framework for ITL
Correntropy for Random Variables: Properties and Applications in Statistical Inference
Correntropy for Random Processes: Properties and Applications in Signal Processing