Quasi-continuous maximum entropy distribution approximation with kernel density


Publisher
Inderscience Publishers
Copyright
Copyright © Inderscience Enterprises Ltd. All rights reserved
ISSN
1756-7017
eISSN
1756-7025
DOI
10.1504/IJIDS.2011.043026

Abstract

This paper extends maximum entropy estimation of discrete probability distributions to the continuous case. This transition leads to a non-parametric estimate of the probability density function that preserves the maximum entropy principle. Furthermore, the derived density estimate attains minimum mean integrated squared error. In a second step, it is shown how boundary conditions can be included, resulting in a probability density function that obeys maximum entropy. The criterion for deviation from a reference distribution is the Kullback-Leibler entropy. It is further shown how the characteristics of a particular distribution can be preserved by using integration kernels with mimetic properties.
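
The abstract combines kernel density estimation, a minimum mean integrated squared error criterion, and the Kullback-Leibler entropy as a measure of deviation from a reference distribution. As a rough illustration of these generic ingredients only (not the authors' algorithm, which is not reproduced here), the Python sketch below builds a Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth, a standard choice that is asymptotically MISE-optimal for roughly Gaussian data, and evaluates the discretised Kullback-Leibler divergence between the estimate and a standard-normal reference on a grid. All function names, parameters, and sample data are illustrative assumptions.

# Illustrative sketch only: generic Gaussian KDE plus a discretised
# Kullback-Leibler divergence against a reference density. This is not
# the quasi-continuous maximum entropy method described in the paper.
import numpy as np

def gaussian_kde(samples, grid):
    """Gaussian kernel density estimate of the sample, evaluated on `grid`."""
    n = len(samples)
    # Silverman's rule-of-thumb bandwidth (asymptotically MISE-optimal
    # for approximately Gaussian data).
    h = 1.06 * np.std(samples, ddof=1) * n ** (-1.0 / 5.0)
    diffs = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))

def kl_divergence(p, q, grid):
    """Discretised Kullback-Leibler divergence D(p || q) on a uniform grid."""
    dx = grid[1] - grid[0]
    mask = (p > 0) & (q > 0)
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

# Example: estimate a density from synthetic samples and compare it with
# a standard-normal reference distribution.
rng = np.random.default_rng(0)
samples = rng.normal(size=500)
grid = np.linspace(-5.0, 5.0, 1001)
p_hat = gaussian_kde(samples, grid)
q_ref = np.exp(-0.5 * grid**2) / np.sqrt(2.0 * np.pi)
print(f"KL(p_hat || q_ref) = {kl_divergence(p_hat, q_ref, grid):.4f}")

With 500 standard-normal samples the reported divergence is small, since the kernel estimate is close to the reference; replacing `q_ref` with a different reference density shows how the Kullback-Leibler criterion quantifies the deviation the abstract refers to.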

Journal

International Journal of Information and Decision Sciences, Inderscience Publishers

Published: Jan 1, 2011
