The Shannon entropy and the associated Kullback–Leibler divergence (relative entropy) between probability measures are fundamental from an applications point of view and arise naturally from statistical concepts. Although confined to information theory and statistics in the work of Shannon and of Kullback and Leibler in the early 1950s, the concept of entropy soon came to be used in optimization models for a variety of problems in engineering and management science. Early work on entropy optimization problems over linear constraint sets (equality or inequality) was carried out by Charnes and Cooper via convex programming techniques. Many other useful applications to a diversity of problems, such as traffic engineering, game theory, information theory, and marketing, were developed by Charnes et al. (see, e.g., [15,17-19] and the references therein). For further general information and applications, we refer the reader to Frieden and to Kay and Marple for engineering problems, and to Lev and Theil and, more recently, to Theil and Fiebig for economic and finance models.
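As a concrete illustration of the two quantities named above, the following sketch computes the Shannon entropy of a discrete distribution and the Kullback–Leibler divergence between two discrete distributions; the distributions used in the usage note are made up for illustration.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), natural log,
    with the convention 0 * log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i).
    Requires q_i > 0 wherever p_i > 0 (absolute continuity)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example distributions (hypothetical): uniform and a skewed alternative.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))        # log(4), the maximum for 4 outcomes
print(kl_divergence(skewed, uniform))  # strictly positive since skewed != uniform
print(kl_divergence(uniform, uniform)) # 0.0: D(p || p) = 0
```

Note that D(p || q) is not symmetric in its arguments, which is why it is called a divergence rather than a distance.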
Title of host publication: Systems and Management Science by Extremal Methods: Research Honoring Abraham Charnes at Age 70
Editors: Fred Young Phillips, John James Rousseau
Place of publication: Boston, MA
Number of pages: 19
State: Published - 1992