Convergence in Distribution, Convergence in Probability and Almost Sure Convergence of Discrete Martingales

Research output: Contribution to journal › Article › peer-review

Abstract

Examples are provided of Markovian martingales that: (i) converge in distribution but fail to converge in probability; (ii) converge in probability but fail to converge almost surely. This stands in sharp contrast to the behavior of series with independent increments, and settles, in the negative, a question raised by Loève in 1964. Subsequently, it is proved that a discrete, real-valued Markov chain with stationary transition probabilities, which is at the same time a martingale, converges almost surely if it converges in distribution, provided the limiting measure has a mean. This fact does not extend to non-discrete processes.
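
For orientation, the main theorem summarized above can be sketched symbolically as follows; the notation is illustrative and not taken from the paper itself. Here $(X_n)$ denotes a discrete, real-valued Markov chain with stationary transition probabilities that is also a martingale, and $\mu$ its limiting distribution:

\[
  X_n \xrightarrow{\;d\;} \mu
  \quad\text{and}\quad
  \int_{\mathbb{R}} |x|\, d\mu(x) < \infty
  \;\Longrightarrow\;
  X_n \ \text{converges almost surely.}
\]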
Original language: English
Pages (from-to): 1374-1379
Journal: The Annals of Mathematical Statistics
Volume: 43
Issue number: 4
DOIs:
State: Published - 1972
Externally published: Yes
