Information Theoretic Learning: Renyi's Entropy and Kernel Perspectives
This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms to adapt linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework where the conventional concepts of second order statistics (covariance, L2 distances, cor...
Main Author: | |
---|---|
Corporate Author: | |
Format: | Electronic |
Language: | English |
Published: | New York, NY : Springer New York : Imprint: Springer, 2010. |
Series: | Information Science and Statistics |
Subjects: | |
Online Access: | https://ezaccess.library.uitm.edu.my/login?url=http://dx.doi.org/10.1007/978-1-4419-1570-2 |
Table of Contents:
- Information theory, machine learning and reproducing kernel Hilbert spaces
- Renyi's entropy, divergence and their nonparametric estimators
- Adaptive information filtering with error entropy and error correntropy criteria
- Algorithms for entropy and correntropy adaptation with applications to linear systems
- Nonlinear adaptive filtering with MEE, MCC and applications
- Classification with EEC, divergence measures and error bounds
- Clustering with ITL principles
- Self-organizing ITL principles for unsupervised learning
- A reproducing kernel Hilbert space framework for ITL
- Correntropy for random variables: properties, and applications in statistical inference
- Correntropy for random processes: properties, and applications in signal processing
- Appendix A: PDF estimation methods and experimental evaluation of ITL descriptors.
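
As a flavor of the nonparametric estimators treated in the second chapter, the sketch below is a minimal, hedged illustration (not code from the book) of the standard Parzen-window estimator of Renyi's quadratic entropy, often called the information potential in the ITL literature. The function name and the kernel width `sigma` are assumptions chosen for illustration, not values prescribed by the text.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X) = -log V(X).

    V(X) is the information potential: the mean of Gaussian kernels
    evaluated at all pairwise sample differences. Convolving two
    Gaussian kernels of width `sigma` gives a Gaussian of width
    sigma * sqrt(2), which is why the variance below is 2 * sigma**2.
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    n = x.shape[0]
    diff = x - x.T                      # pairwise differences, shape (n, n)
    s2 = 2.0 * sigma ** 2               # variance of the convolved kernel
    gauss = np.exp(-diff ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    information_potential = gauss.sum() / n ** 2
    return -np.log(information_potential)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.normal(size=500)      # toy one-dimensional data
    print(renyi_quadratic_entropy(samples, sigma=0.5))
```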