An Information Theory Approach to Deconvolution
by Bob Godfrey
The Kullback information measure is a scale invariant measure of the amount of information available for
discrimination between two distributions. Using Pearson's system of probability density functions to
parameterize a time series, the Kullback measure is used to derive a norm measuring the information content
of a time series relative to a Gaussian distribution. Maximizing the norm drives the time series away from Gaussian, the only assumption
being that the current distribution of the time series can be represented by a particular system of Pearson
densities. The maximization procedure is steepest descent, with the gradient derived from the norm above.
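As a rough numerical illustration of such a non-Gaussianity norm (a sketch only; the report derives its norm through a Pearson parameterization, not a histogram), the Kullback divergence between the empirical distribution of a time series and a Gaussian with the same mean and variance can be estimated as follows. The function name and the histogram-based estimator are illustrative assumptions, not the report's method:

```python
import numpy as np

def kl_vs_gaussian(x, bins=64):
    """Estimate the Kullback divergence between the empirical density
    of a time series and a Gaussian of matching mean and variance.
    Matching the first two moments makes the measure scale invariant."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    # Empirical density from a histogram.
    p, edges = np.histogram(x, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    # Reference Gaussian evaluated at the bin centers.
    q = np.exp(-0.5 * ((centers - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) * width)

rng = np.random.default_rng(0)
gauss = rng.normal(size=20000)
spiky = rng.laplace(size=20000)  # heavier tails, farther from Gaussian
```

A spiky (heavy-tailed) series scores higher on this measure than a Gaussian one, which is the property the maximization exploits: driving the norm up drives the series away from Gaussian.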
The generalized Gaussian (see Gray, p. 123, this report) and Pearson system both can be used to parameterize
the probability density function of a time series. A brief comparison between the two was made, and the
generalized Gaussian was found to have lower RMS error and lower maximum error than the Pearson.
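To make the generalized-Gaussian parameterization concrete, one simple way to fit its shape parameter is by matching the sample kurtosis (a moment-matching sketch under assumed conventions; the report's comparison may use a different fitting criterion). The shape parameter beta controls the tails: beta = 2 recovers the Gaussian (kurtosis 3), beta = 1 the Laplace (kurtosis 6):

```python
import math

def gg_shape_from_kurtosis(kurt):
    """Fit the generalized-Gaussian shape parameter beta by matching
    kurtosis: kurt(beta) = Gamma(5/b) * Gamma(1/b) / Gamma(3/b)**2,
    which decreases monotonically in beta, so bisection suffices."""
    g = math.gamma

    def model_kurt(beta):
        return g(5.0 / beta) * g(1.0 / beta) / g(3.0 / beta) ** 2

    lo, hi = 0.3, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if model_kurt(mid) > kurt:
            lo = mid  # kurtosis too high: need a larger beta
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The Pearson system instead selects a density family from the first four moments, so both parameterizations can be driven by the same sample statistics.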
A general method of parameterizing probability density functions using quantiles is presented in the appendix.
Although results are almost identical to those obtained using a Cauchy parameterization, the method has the
potential to parameterize multi-modal distributions.
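The quantile idea can be sketched as follows (an illustrative estimator, not the appendix's exact construction): the density at the p-th quantile is the reciprocal of the slope of the quantile function, f(Q(p)) = 1/Q'(p), so estimating quantiles on a grid of probabilities and differencing yields a density estimate directly:

```python
import numpy as np

def quantile_density(x, probs=None):
    """Quantile-based density sketch: estimate the empirical quantile
    function Q(p) on a probability grid, then use f(Q(p)) = 1 / Q'(p)."""
    x = np.asarray(x, dtype=float)
    if probs is None:
        probs = np.linspace(0.01, 0.99, 25)
    q = np.quantile(x, probs)      # empirical quantile function Q(p)
    slope = np.gradient(q, probs)  # central-difference estimate of Q'(p)
    return q, 1.0 / slope

rng = np.random.default_rng(0)
q, dens = quantile_density(rng.normal(size=50000))
```

Because the density is read directly off the quantile function rather than from a fixed unimodal family, nothing in the construction restricts it to a single mode, which is the potential advantage noted above.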