On ϕ-Divergence and Its Applications

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

The Shannon entropy and the associated Kullback-Leibler divergence measure (relative entropy) between probability measures are fundamental from the applications point of view, and arise naturally from statistical concepts. Although initially confined to information theory and statistics in the work of Shannon [38] and Kullback and Leibler [31] in the early 1950s, the concept of entropy came to be used in optimization modeling for various problems of engineering and management science. Early work on entropy optimization problems over linear constraint sets (equality or inequality) was carried out by Charnes and Cooper [16] via convex programming techniques. Many other useful applications in a diversity of problems such as traffic engineering, game theory, information theory, and marketing were developed by Charnes et al. (see, e.g., [15,17-19] and the references therein). For further general information and applications, we refer the reader to Frieden [23] and Kay and Marple [30] for engineering problems, and to Lev and Theil [32] and, more recently, Theil and Fiebig [39] for economic and finance models.
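As a minimal illustration of the two quantities named above (not code from the chapter itself), the sketch below computes the Kullback-Leibler divergence D(p || q) for finite distributions and checks the standard identity linking it to Shannon entropy: against a uniform reference on n points, D(p || u) = log n - H(p).

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats.

    Terms with p_i = 0 contribute 0 by the convention 0 * log 0 = 0.
    """
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    p and q are finite probability distributions (nonnegative weights
    summing to 1, with q_i > 0 wherever p_i > 0).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

# Illustrative distributions (hypothetical values, chosen for the example).
p = [0.5, 0.3, 0.2]
u = [1.0 / 3.0] * 3  # uniform reference on n = 3 points

print(kl_divergence(p, p))  # zero: a distribution has no divergence from itself
print(kl_divergence(p, u))  # strictly positive, since p is non-uniform
# Identity: D(p || u) = log n - H(p)
print(abs(kl_divergence(p, u) - (math.log(3) - shannon_entropy(p))))
```

The nonnegativity of D(p || q), with equality iff p = q (Gibbs' inequality), is what makes it usable as a divergence objective in the entropy optimization problems discussed in the abstract.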
Original language: Undefined/Unknown
Title of host publication: Systems and Management Science by Extremal Methods: Research Honoring Abraham Charnes at Age 70
Editors: Fred Young Phillips, John James Rousseau
Place of publication: Boston, MA
Publisher: Springer US
Pages: 255-273
Number of pages: 19
ISBN (Print): 978-1-4615-3600-0
State: Published - 1992