

Please use this identifier to cite or link to this item: http://etd.iisc.ernet.in/2005/2649

Title: Minimization Problems Based On A Parametric Family Of Relative Entropies
Authors: Ashok Kumar, M
Advisors: Sundaresan, Rajesh
Keywords: Information Theory
Information Geometry
Kullback-Leibler Divergence
Linear Entropy
Power-law Family
Tsallis Entropy
Pythagorean Property
Relative Entropy
Rényi Entropy
Exponential Family
Relative Entropy Minimization
Robust Statistics
Information Projection
Parametric Family
Relative Entropies
Submitted Date: May-2015
Series/Report no.: G26742
Abstract: We study minimization problems with respect to a one-parameter family of generalized relative entropies. These relative entropies, which we call relative α-entropies (denoted Iα(P, Q)), arise as redundancies under mismatched compression when cumulants of compression lengths are considered instead of expected compression lengths. These parametric relative entropies generalize the usual relative entropy (Kullback-Leibler divergence). Like relative entropy, the relative α-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. We explore the geometry underlying various statistical models and its relevance to information theory and to robust statistics. The thesis consists of three parts.

In the first part, we study minimization of Iα(P, Q) as the first argument varies over a convex set E of probability distributions. We show the existence of a unique minimizer when the set E is closed in an appropriate topology. We then study minimization of Iα on a particular convex set, a linear family, which is one that arises from linear statistical constraints. This minimization problem generalizes the maximum Rényi or Tsallis entropy principle of statistical physics. The structure of the minimizing probability distribution naturally suggests a statistical model of power-law probability distributions, which we call an α-power-law family. Such a family is analogous to the exponential family that arises when relative entropy is minimized subject to the same linear statistical constraints.

In the second part, we study minimization of Iα(P, Q) over the second argument. This minimization is generally over parametric families such as the exponential family or the α-power-law family, and is of interest in robust statistics (α > 1) and in constrained compression settings (α < 1).

In the third part, we show an orthogonality relationship between the α-power-law family and an associated linear family. As a consequence, the minimization of Iα(P, ·), when the second argument comes from an α-power-law family, can be shown to be equivalent to a minimization of Iα(·, R), for a suitable R, where the first argument comes from a linear family. The latter turns out to be the simpler problem of minimizing a quasi-convex objective function subject to linear constraints. Standard techniques are available to solve such problems, for example via a sequence of convex feasibility problems, or via a sequence of such problems on simpler single-constraint linear families.
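The abstract does not reproduce the definition of Iα. The sketch below assumes the form of the relative α-entropy used in the related literature (an assumption, not taken from this page) and checks numerically that it reduces to the Kullback-Leibler divergence as α → 1 and vanishes when P = Q:

```python
import math

def relative_alpha_entropy(p, q, alpha):
    """Relative alpha-entropy I_alpha(P, Q) of probability vectors p, q,
    in the form assumed here (alpha > 0, alpha != 1):

        I_alpha(P, Q) = a/(1-a) * log sum_x p(x) q(x)^(a-1)
                        - 1/(1-a) * log sum_x p(x)^a
                        + log sum_x q(x)^a

    Reduces to the KL divergence in the limit alpha -> 1.
    """
    a = alpha
    s1 = sum(pi * qi ** (a - 1) for pi, qi in zip(p, q))
    s2 = sum(pi ** a for pi in p)
    s3 = sum(qi ** a for qi in q)
    return (a / (1 - a)) * math.log(s1) - (1 / (1 - a)) * math.log(s2) + math.log(s3)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]
print(relative_alpha_entropy(p, q, 1.0001))  # close to the KL value below
print(kl_divergence(p, q))
```

With alpha near 1 the two printed values agree to roughly three decimal places, consistent with the α-entropies being a one-parameter generalization of relative entropy.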
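For the classical α = 1 case mentioned above, minimizing relative entropy over a linear family yields an exponential tilt of the second argument Q — the sense in which the exponential family arises from linear statistical constraints. A minimal pure-Python sketch (the statistic f, target moment t, and bisection ranges are illustrative choices, not from the thesis):

```python
import math

def i_projection(q, f, t, lo=-50.0, hi=50.0, iters=100):
    """I-projection of Q onto the linear family {P : sum_x p(x) f(x) = t}
    for the classical alpha = 1 (Kullback-Leibler) case.  The minimizer
    is an exponential tilt p(x) proportional to q(x) * exp(theta * f(x));
    the tilted mean is increasing in theta, so bisect on theta to hit t."""
    def tilt(theta):
        w = [qi * math.exp(theta * fi) for qi, fi in zip(q, f)]
        z = sum(w)
        p = [wi / z for wi in w]
        return p, sum(pi * fi for pi, fi in zip(p, f))
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        p, m = tilt(mid)
        if m < t:
            lo = mid
        else:
            hi = mid
    return p

q = [1/3, 1/3, 1/3]   # reference distribution
f = [1.0, 2.0, 3.0]   # statistic defining the single linear constraint
p_star = i_projection(q, f, 2.5)
print(p_star, sum(pi * fi for pi, fi in zip(p_star, f)))
```

Raising the target mean from 2 (the uniform value) to 2.5 tilts mass toward the larger values of f; the α ≠ 1 analogue described in the abstract replaces this exponential form with an α-power-law form.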
Abstract file URL: http://etd.ncsi.iisc.ernet.in/abstracts/3459/G26742-Abs.pdf
URI: http://etd.iisc.ernet.in/handle/2005/2649
Appears in Collections:Electrical Communication Engineering (ece)

Files in This Item:

File: G26742.pdf
Size: 752.61 kB
Format: Adobe PDF

Items in etd@IISc are protected by copyright, with all rights reserved, unless otherwise indicated.

