Monotonicity of the Trace-Inverse of Covariance Submatrices and Two-Sided Prediction

Research output: Contribution to journal › Article › peer-review

Abstract

It is common to assess the 'memory strength' of a stationary process by looking at how fast the normalized log-determinant of its covariance submatrices (i.e., entropy rate) decreases. In this work, we propose an alternative characterization in terms of the normalized trace-inverse of the covariance submatrices. We show that this sequence is monotonically non-decreasing and is constant if and only if the process is white. Furthermore, while the entropy rate is associated with one-sided prediction errors (present from past), the new measure is associated with two-sided prediction errors (present from past and future). Minimizing this measure is then used as an alternative to Burg's maximum-entropy principle for spectral estimation. We also propose a counterpart for non-stationary processes, by looking at the average trace-inverse of subsets.
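
To make the two sequences contrasted in the abstract concrete, the sketch below computes the normalized log-determinant and the normalized trace-inverse of growing covariance submatrices of a stationary process. This is only an illustration, not code from the paper: the AR(1)-style autocovariance r(k) = rho^|k| and the value rho = 0.7 are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import toeplitz

# Illustrative stationary autocovariance r(k) = rho**|k| (AR(1)-type).
# The value of rho and the range of n are assumptions for this sketch only.
rho = 0.7

for n in range(1, 9):
    C = toeplitz(rho ** np.arange(n))          # n x n Toeplitz covariance submatrix
    logdet_n = np.linalg.slogdet(C)[1] / n     # normalized log-determinant (entropy-rate sequence)
    trinv_n = np.trace(np.linalg.inv(C)) / n   # normalized trace-inverse (proposed measure)
    print(f"n={n}: logdet/n = {logdet_n:.4f}, trace-inverse/n = {trinv_n:.4f}")
```

For this choice, the normalized log-determinant decreases with n while the normalized trace-inverse increases, consistent with the monotonicity stated above; setting rho = 0 (a white process) leaves both sequences constant, at 0 and 1 respectively.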

Original language: English
Pages (from-to): 2767-2781
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Volume: 68
Issue number: 4
DOIs
State: Published - 1 Apr 2022

Funding

Funders and funder numbers:

• Israel Ministry of Economy and Industry
• Israel Science Foundation: 2427/19, 2623/20, 2077/20

Keywords

• Maximum entropy
• causality
• minimum mean square error
• prediction
