Abstract
Examples are provided of Markovian martingales that: (i) converge in distribution but fail to converge in probability; (ii) converge in probability but fail to converge almost surely. This stands in sharp contrast to the behavior of series with independent increments, and settles, in the negative, a question raised by Loève in 1964. Subsequently, it is proved that a discrete, real-valued Markov chain with stationary transition probabilities, which is at the same time a martingale, converges almost surely if it converges in distribution, provided the limiting measure has a mean. This fact does not extend to non-discrete processes.
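The positive result in the abstract can be restated formally as follows; this is only a hedged reading of the sentence above, with the notation $(X_n)$ and $\mu$ introduced here rather than taken from the paper.

```latex
% Hedged restatement of the result described in the abstract;
% the symbols (X_n) and \mu are our notation, not the paper's.
\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\newtheorem{theorem}{Theorem}
\begin{document}
\begin{theorem}[as stated in the abstract]
Let $(X_n)_{n \ge 0}$ be a real-valued, discrete (countable state space)
Markov chain with stationary transition probabilities that is also a
martingale. If $X_n$ converges in distribution to a law $\mu$ with
$\int_{\mathbb{R}} \lvert x \rvert \, \mu(\mathrm{d}x) < \infty$,
then $X_n$ converges almost surely.
\end{theorem}
\end{document}
```

Per the abstract, the discreteness assumption cannot be dropped: the result does not extend to non-discrete processes.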
Original language | English |
---|---|
Pages (from-to) | 1374 - 1379 |
Journal | The Annals of Mathematical Statistics |
Volume | 43 |
Issue number | 4 |
State | Published - 1972 |
Externally published | Yes |