On the use of data compression measures to analyze robust designs

Research output: Contribution to journal › Article › peer-review


In this paper, we suggest a potential use of data compression measures, such as entropy and Huffman coding, to assess the effects of noise factors on the reliability of tested systems. In particular, we extend the Taguchi method for robust design by computing the entropy of the percent-contribution values of the noise factors. The new measures are computed already at the parameter-design stage and, together with the traditional S/N ratios, enable the specification of a robust design. Assuming that (some of) the noise factors should be neutralized, the entropy of a design reflects the potential effort that will be required in the tolerance-design stage to reach a more reliable system. Using a small example, we illustrate how the new measure might alter the designer's decision in comparison with the traditional Taguchi method, and ultimately yield a system with a lower quality loss. Assuming that the percent-contribution values reflect the probability that a noise factor triggers a disturbance in the system response, a series of probabilistic algorithms can be applied to the robust-design problem. We focus on the Huffman coding algorithm, and show how to implement it so that the designer obtains the minimal expected number of tests required to find the disturbing noise factor. The entropy measure, in this case, provides a lower bound on the algorithm's performance.
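The abstract's core idea can be sketched in a few lines: treat the percent-contribution values as a probability distribution over noise factors, build a Huffman code over it (each codeword bit corresponding to one binary test), and compare the expected number of tests against the entropy lower bound. The sketch below is an illustration under that reading, not the authors' implementation; the contribution values are hypothetical.

```python
import heapq
import math


def entropy(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


def huffman_expected_length(probs):
    """Expected codeword length of a Huffman code over `probs`,
    i.e. the expected number of binary tests to isolate a factor."""
    # Each heap entry is (probability, expected length accumulated so far).
    heap = [(p, 0.0) for p in probs]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, l1 = heapq.heappop(heap)
        p2, l2 = heapq.heappop(heap)
        # Merging two subtrees adds one level (one test) to every leaf
        # below them, contributing (p1 + p2) to the expected length.
        heapq.heappush(heap, (p1 + p2, l1 + l2 + p1 + p2))
    return heap[0][1]


# Hypothetical percent-contribution values of four noise factors.
contribs = [0.5, 0.25, 0.15, 0.10]
H = entropy(contribs)                 # lower bound on expected tests
L = huffman_expected_length(contribs)
assert H <= L < H + 1  # Huffman is within one test of the entropy bound
```

For this example the Huffman test plan needs 1.75 expected tests, while the entropy bound is about 1.74 bits; a more uniform contribution profile would raise both numbers, signaling greater tolerance-design effort.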

Original language: English
Pages (from-to): 381-388
Number of pages: 8
Journal: IEEE Transactions on Reliability
Issue number: 3
State: Published - Sep 2005


Keywords:
  • Compression rate
  • Control & noise factors
  • Entropy
  • Experimentation
  • Performance measure
  • Robust designs
  • S/N ratio
  • Taguchi method


