Markov chain

Lecture



Fig. An example of a Markov chain with two states

A Markov chain is a sequence of random events with a finite or countable number of outcomes, characterized by the property that, loosely speaking, given a fixed present, the future is independent of the past. It is named in honor of A. A. Markov (senior).

Markov chain with discrete time

Definition

A sequence of discrete random variables $\{X_n\}_{n \geq 0}$ is called a simple Markov chain (with discrete time) if

$\mathbb{P}(X_{n+1} = i_{n+1} \mid X_n = i_n, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = \mathbb{P}(X_{n+1} = i_{n+1} \mid X_n = i_n)$.

Thus, in the simplest case, the conditional distribution of the next state of the Markov chain depends only on the current state and does not depend on any earlier states (unlike Markov chains of higher order).

The range of values of the random variables $X_n$ is called the state space of the chain, and the number $n$ — the step number.
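The Markov property means the next state can be sampled from the current state alone. A minimal sketch (the two states "A", "B" and their transition probabilities are invented for the example):

```python
import random

# Hypothetical transition probabilities: P[state][next] = P(next | state).
P = {
    "A": {"A": 0.7, "B": 0.3},
    "B": {"A": 0.4, "B": 0.6},
}

def step(state, rng):
    """Draw the next state using only the current state (Markov property)."""
    r = rng.random()
    acc = 0.0
    for nxt, prob in P[state].items():
        acc += prob
        if r < acc:
            return nxt
    return nxt  # guard against floating-point rounding of the cumulative sum

def simulate(start, n, seed=0):
    """Simulate n steps of the chain, returning the visited path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("A", 10))
```

With a fixed seed the trajectory is reproducible; each step consults only the last element of the path.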

Transition matrix and homogeneous chains

The matrix $P(n)$, where

$p_{ij}(n) = \mathbb{P}(X_{n+1} = j \mid X_n = i),$

is called the matrix of transition probabilities at step $n$, and the vector $p = (p_1, p_2, \ldots)$, where

$p_i = \mathbb{P}(X_0 = i),$

— the initial distribution of the Markov chain.

Obviously, the transition probability matrix is stochastic, that is,

$\sum_j p_{ij}(n) = 1$ for all $n, i$.

A Markov chain is called homogeneous if the transition probability matrix does not depend on the step number, that is,

$P(n) = P$ for all $n$.

Otherwise, the Markov chain is called inhomogeneous. In the following, we will assume that we are dealing with homogeneous Markov chains.
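The stochasticity condition (non-negative entries, each row summing to 1) is easy to check numerically. A minimal sketch, with a hypothetical two-state matrix:

```python
def is_stochastic(P, tol=1e-12):
    """Check that a transition matrix is row-stochastic:
    non-negative entries and each row summing to 1."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

# Hypothetical 2-state transition matrix (rows index the current state).
P = [[0.7, 0.3],
     [0.4, 0.6]]
print(is_stochastic(P))             # -> True
print(is_stochastic([[0.5, 0.6]]))  # -> False: row sums to 1.1
```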

Finite-dimensional distributions and transition matrix in n steps

From the properties of conditional probability and the definition of a homogeneous Markov chain we get:

$\mathbb{P}(X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n) = p_{i_0}\, p_{i_0 i_1} \cdots p_{i_{n-1} i_n},$

whence the special case of the Kolmogorov–Chapman equation follows:

$P^{(n)} = P^n,$

that is, the transition probability matrix for $n$ steps of a homogeneous Markov chain is the $n$-th power of the transition probability matrix for 1 step. Finally,

$\mathbb{P}(X_n = i) = \left( p\, P^n \right)_i.$
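The identity $P^{(m+n)} = P^m P^n$ and the propagation of a distribution by matrix powers can be sketched directly (the two-state matrix and initial distribution are invented for the example):

```python
def mat_mul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """n-th power of P: the n-step transition matrix of a homogeneous chain."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# Hypothetical 2-state chain and initial distribution.
P = [[0.7, 0.3],
     [0.4, 0.6]]
p0 = [1.0, 0.0]

# Kolmogorov–Chapman special case: P^(2+3) = P^2 · P^3.
P5 = mat_pow(P, 5)
P23 = mat_mul(mat_pow(P, 2), mat_pow(P, 3))
print(all(abs(P5[i][j] - P23[i][j]) < 1e-12
          for i in range(2) for j in range(2)))  # -> True

# Distribution after 5 steps: p0 · P^5.
p5 = [sum(p0[i] * P5[i][j] for i in range(2)) for j in range(2)]
print([round(x, 4) for x in p5])
```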

Classification of Markov Chain States

  • Recurrent state;
  • Recurrent Markov chain;
  • Accessible state;
  • Irreducible Markov chain;
  • Periodic state;
  • Periodic Markov chain;
  • Absorbing state;
  • Ergodic state.

Examples

  • Branching process;
  • Random walk;
  • In the TV series Numb3rs (season 1, episode 13), a Markov chain is used in an attempt to track down two escaped prisoners.

Markov chain with continuous time

Definition

A family of discrete random variables $\{X_t\}_{t \geq 0}$ is called a Markov chain (with continuous time) if

$\mathbb{P}(X_{t_{n+1}} = i_{n+1} \mid X_{t_n} = i_n, \ldots, X_{t_0} = i_0) = \mathbb{P}(X_{t_{n+1}} = i_{n+1} \mid X_{t_n} = i_n)$ for any $t_0 < t_1 < \cdots < t_{n+1}$.

A Markov chain with continuous time is called homogeneous if

$\mathbb{P}(X_{t+h} = j \mid X_t = i) = \mathbb{P}(X_h = j \mid X_0 = i)$ for all $t, h \geq 0$.

The matrix of transition functions and the Kolmogorov – Chapman equation

Similarly to the discrete-time case, the finite-dimensional distributions of a homogeneous Markov chain with continuous time are completely determined by the initial distribution

$p_i = \mathbb{P}(X_0 = i)$

and the matrix of transition functions (transition probabilities)

$p_{ij}(t) = \mathbb{P}(X_t = j \mid X_0 = i).$

The matrix of transition probabilities satisfies the Kolmogorov–Chapman equation: $P(t+s) = P(t)\,P(s)$, or

$p_{ij}(t+s) = \sum_k p_{ik}(t)\, p_{kj}(s).$

Intensity matrix and Kolmogorov differential equations

By definition, the intensity matrix is

$Q = \lim_{h \to 0^{+}} \frac{P(h) - \mathbf{1}}{h},$

or, equivalently,

$q_{ij} = \lim_{h \to 0^{+}} \frac{p_{ij}(h) - \delta_{ij}}{h}.$

From the Kolmogorov–Chapman equation, two equations follow:

  • the Kolmogorov forward equation

    $\frac{dP(t)}{dt} = P(t)\,Q;$

  • the Kolmogorov backward equation

    $\frac{dP(t)}{dt} = Q\,P(t).$

For both equations the initial condition $P(0) = \mathbf{1}$ is chosen. The corresponding solution is $P(t) = e^{Qt}$.
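The solution $P(t) = e^{Qt}$ can be approximated by a truncated Taylor series. The sketch below, with an invented 2×2 intensity matrix, checks that $P(t)$ is stochastic and satisfies the semigroup identity $P(t+s) = P(t)P(s)$:

```python
def mat_exp(Q, t, terms=40):
    """e^{Qt} via a truncated Taylor series
    (adequate for small matrices and moderate t)."""
    n = len(Q)
    A = [[Q[i][j] * t for j in range(n)] for i in range(n)]
    term = [[float(i == j) for j in range(n)] for i in range(n)]  # (At)^0/0!
    total = [row[:] for row in term]
    for k in range(1, terms):
        term = [[sum(term[i][m] * A[m][j] for m in range(n)) / k
                 for j in range(n)] for i in range(n)]
        total = [[total[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return total

# Hypothetical intensity matrix (each row sums to 0).
Q = [[-1.0, 1.0],
     [2.0, -2.0]]

P1 = mat_exp(Q, 1.0)
print(all(abs(sum(row) - 1.0) < 1e-9 for row in P1))  # rows of P(t) sum to 1

# Kolmogorov–Chapman: P(0.5 + 0.5) = P(0.5) P(0.5).
Ph = mat_exp(Q, 0.5)
prod = [[sum(Ph[i][k] * Ph[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
print(all(abs(P1[i][j] - prod[i][j]) < 1e-9
          for i in range(2) for j in range(2)))  # -> True
```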

Properties of the matrices P and Q

For any $t > 0$ the matrix $P(t)$ has the following properties:

  1. The matrix elements $p_{ij}(t)$ are non-negative: $p_{ij}(t) \geq 0$ (non-negativity of probability).
  2. The sum of the elements in each row of $P(t)$ equals 1: $\sum_j p_{ij}(t) = 1$ (total probability), that is, the matrix $P(t)$ is stochastic on the right (by rows).
  3. All eigenvalues $\lambda$ of the matrix $P(t)$ do not exceed 1 in absolute value: $|\lambda| \leq 1$. If $|\lambda| = 1$, then $\lambda = 1$.
  4. The eigenvalue $\lambda = 1$ of the matrix $P(t)$ corresponds to at least one non-negative left eigenvector — a row vector (equilibrium): $\pi P(t) = \pi$, $\pi_i \geq 0$, $\sum_i \pi_i = 1$.
  5. For the eigenvalue $\lambda = 1$ of the matrix $P(t)$, all root vectors are eigenvectors, that is, the corresponding Jordan cells are trivial.

The matrix $Q$ has the following properties:

  1. The off-diagonal elements of $Q$ are non-negative: $q_{ij} \geq 0$ for $i \neq j$.
  2. The diagonal elements of $Q$ are non-positive: $q_{ii} \leq 0$.
  3. The sum of the elements in each row of $Q$ equals 0: $\sum_j q_{ij} = 0$.
  4. The real parts of all eigenvalues $\lambda$ of the matrix $Q$ are non-positive: $\mathrm{Re}\,\lambda \leq 0$. If $\mathrm{Re}\,\lambda = 0$, then $\lambda = 0$.
  5. The eigenvalue $\lambda = 0$ of the matrix $Q$ corresponds to at least one non-negative left eigenvector — a row vector (equilibrium): $\pi Q = 0$, $\pi_i \geq 0$, $\sum_i \pi_i = 1$.
  6. For the eigenvalue $\lambda = 0$ of the matrix $Q$, all root vectors are eigenvectors, that is, the corresponding Jordan cells are trivial.
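The defining properties of an intensity matrix (items 1–3) can be verified directly, and an equilibrium left eigenvector can be found by iterating the stochastic matrix $1 + hQ$ for a small step $h$. A sketch, with an invented 2×2 generator:

```python
def is_intensity_matrix(Q, tol=1e-12):
    """Check: q_ij >= 0 for i != j, q_ii <= 0, and zero row sums."""
    n = len(Q)
    for i in range(n):
        if Q[i][i] > tol:
            return False
        if any(Q[i][j] < -tol for j in range(n) if j != i):
            return False
        if abs(sum(Q[i])) > tol:
            return False
    return True

def equilibrium(Q, h=1e-3, iters=200000, tol=1e-12):
    """Left eigenvector pi with pi Q = 0, found by iterating the
    stochastic matrix 1 + h Q (valid for small h)."""
    n = len(Q)
    pi = [1.0 / n] * n
    for _ in range(iters):
        new = [pi[j] + h * sum(pi[i] * Q[i][j] for i in range(n))
               for j in range(n)]
        if max(abs(new[j] - pi[j]) for j in range(n)) < tol * h:
            return new
        pi = new
    return pi

Q = [[-1.0, 1.0],
     [2.0, -2.0]]  # hypothetical generator
print(is_intensity_matrix(Q))           # -> True
pi = equilibrium(Q)
print([round(x, 6) for x in pi])        # -> [0.666667, 0.333333]
```

Here $\pi Q = 0$ gives $-\pi_1 + 2\pi_2 = 0$, so the equilibrium is $(2/3,\ 1/3)$.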

Graph of transitions, connectivity and ergodic Markov chains

For a Markov chain with continuous time, an oriented transition graph (briefly, a transition graph) is constructed according to the following rules:

  • The set of vertices of the graph coincides with the set of chain states.
  • Vertices $i$ and $j$ are connected by an oriented edge $i \to j$ if $q_{ij} > 0$ (i.e. the flow rate from the $i$-th state to the $j$-th is positive).

Topological properties of the transition graph are related to the spectral properties of the matrix $Q$. In particular, the following theorems hold for finite Markov chains:

  • The following three properties A, B, C of a finite Markov chain are equivalent (chains possessing them are sometimes called weakly ergodic):

A. For any two different vertices $i$, $j$ of the transition graph there exists a vertex $k$ of the graph (a “common sink”) such that there are oriented paths from the vertex $i$ to the vertex $k$ and from the vertex $j$ to the vertex $k$. Note: the case $k = i$ or $k = j$ is possible; in that case the trivial (empty) path from $i$ to $i$ or from $j$ to $j$ is also considered an oriented path.

B. The zero eigenvalue of the matrix $Q$ is non-degenerate.

C. As $t \to \infty$ the matrix $P(t)$ tends to a matrix in which all rows coincide (and coincide, obviously, with the equilibrium distribution).

  • The following five properties A, B, C, D, E of a finite Markov chain are equivalent (chains possessing them are called ergodic):

A. The transition graph of the chain is orientedly (strongly) connected.

B. The zero eigenvalue of the matrix $Q$ is non-degenerate and corresponds to a strictly positive left eigenvector (the equilibrium distribution).

C. For some $t > 0$ the matrix $P(t)$ is strictly positive (i.e. $p_{ij}(t) > 0$ for all $i, j$).

D. For all $t > 0$ the matrix $P(t)$ is strictly positive.

E. As $t \to \infty$ the matrix $P(t)$ tends to a strictly positive matrix in which all rows coincide (and coincide, obviously, with the equilibrium distribution).
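The two graph criteria (strong connectivity for ergodicity, a common sink for every pair for weak ergodicity) can be checked by simple reachability searches. A sketch with two hypothetical 3-state generators:

```python
def reachable(adj, start):
    """Vertices reachable from `start` along oriented edges (DFS);
    includes `start` itself, matching the trivial-path convention."""
    seen, stack = {start}, [start]
    while stack:
        v = stack.pop()
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen

def transition_graph(Q):
    """Edge i -> j whenever q_ij > 0 (i != j)."""
    n = len(Q)
    return [[j for j in range(n) if j != i and Q[i][j] > 0] for i in range(n)]

def is_ergodic(Q):
    """Criterion A (ergodic): the transition graph is orientedly connected."""
    adj = transition_graph(Q)
    n = len(adj)
    return all(len(reachable(adj, v)) == n for v in range(n))

def is_weakly_ergodic(Q):
    """Criterion A (weakly ergodic): every pair of vertices has a common sink."""
    adj = transition_graph(Q)
    n = len(adj)
    reach = [reachable(adj, v) for v in range(n)]
    return all(reach[i] & reach[j] for i in range(n) for j in range(n))

# Hypothetical generators (diagonals fixed by zero row sums):
Q_cycle = [[-1, 1, 0], [0, -1, 1], [1, 0, -1]]   # cycle 1 -> 2 -> 3 -> 1
Q_sink  = [[-1, 1, 0], [0, 0, 0], [0, 1, -1]]    # 1 -> 2 <- 3, state 2 absorbing
print(is_ergodic(Q_cycle), is_weakly_ergodic(Q_cycle))  # -> True True
print(is_ergodic(Q_sink), is_weakly_ergodic(Q_sink))    # -> False True
```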

Examples

Fig. Examples of transition graphs for Markov chains: a) the chain is not weakly ergodic (two of the states have no common sink); b) a weakly ergodic but not ergodic chain (the transition graph is not orientedly connected); c) an ergodic chain (the transition graph is orientedly connected).

Consider Markov chains with three states and continuous time, corresponding to the transition graphs shown in the figure. In each case, the only nonzero off-diagonal elements of the intensity matrix are the $q_{ij}$ for which the graph contains the edge $i \to j$; the remaining (diagonal) elements are determined by the property of $Q$ that the sum of the elements in each row is 0.

Basic kinetic equation

The basic kinetic equation describes the evolution of the probability distribution in a Markov chain with continuous time. The “basic equation” here is not an epithet but a translation of the English term master equation. For a probability distribution row vector $\pi$, the basic kinetic equation is

$\frac{d\pi}{dt} = \pi Q$

and coincides, essentially, with the Kolmogorov forward equation. In the physical literature, probability column vectors are used more often, and the basic kinetic equation is written in a form that explicitly uses the law of conservation of total probability:

$\frac{dp_i}{dt} = \sum_{j,\, j \neq i} \left( w_{ij} p_j - w_{ji} p_i \right),$

where $w_{ij} = q_{ji}$.

If there exists a positive equilibrium $p_i^* > 0$ for the basic kinetic equation, then it can be written in the form

$\frac{dp_i}{dt} = \sum_{j,\, j \neq i} w_{ij} p_j^* \left( \frac{p_j}{p_j^*} - \frac{p_i}{p_i^*} \right).$
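The master equation can be integrated numerically; because gain and loss terms pair up, the total probability is conserved exactly. A minimal explicit-Euler sketch with hypothetical rates between three states:

```python
def master_step(p, W, dt):
    """One explicit-Euler step of the master equation
    dp_i/dt = sum_j (w_ij p_j - w_ji p_i), with W[i][j] = w_ij."""
    n = len(p)
    dp = [sum(W[i][j] * p[j] - W[j][i] * p[i] for j in range(n) if j != i)
          for i in range(n)]
    return [p[i] + dt * dp[i] for i in range(n)]

# Hypothetical transition rates between three states.
W = [[0.0, 1.0, 0.5],
     [2.0, 0.0, 1.0],
     [0.5, 1.0, 0.0]]
p = [1.0, 0.0, 0.0]
for _ in range(10000):
    p = master_step(p, W, 1e-3)
print(round(sum(p), 9))  # total probability is conserved -> 1.0
print([round(x, 3) for x in p])
```

Conservation follows from the antisymmetry of the gain/loss pairs: summing $dp_i/dt$ over $i$ cancels term by term.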

Lyapunov functions for the basic kinetic equation

For the basic kinetic equation, there exists a rich family of convex Lyapunov functions — functions of the probability distribution that vary monotonically with time. Let $h(x)$ be a convex function of one variable. For any positive probability distribution ($p_i > 0$) we define the Morimoto function $H_h(p)$:

$H_h(p) = \sum_i p_i^*\, h\!\left(\frac{p_i}{p_i^*}\right).$

The time derivative of $H_h(p)$, if $p$ satisfies the basic kinetic equation, is

$\frac{dH_h(p)}{dt} = \sum_{i,j,\, j \neq i} w_{ij} p_j^* \left[ h'\!\left(\frac{p_i}{p_i^*}\right) \left( \frac{p_j}{p_j^*} - \frac{p_i}{p_i^*} \right) - h\!\left(\frac{p_j}{p_j^*}\right) + h\!\left(\frac{p_i}{p_i^*}\right) \right] \leq 0.$

The last inequality holds because of the convexity of $h$.

Examples of Morimoto functions $H_h(p)$

  • $h(x) = |x - 1|$, $H_h(p) = \sum_i |p_i - p_i^*|$;

this function is the distance from the current probability distribution to the equilibrium one in the $l^1$-norm. The time shift is a contraction of the space of probability distributions in this norm. (For the properties of contractions, see the article Banach fixed-point theorem.)

  • $h(x) = x \ln x$, $H_h(p) = \sum_i p_i \ln\!\left(\frac{p_i}{p_i^*}\right)$;

this function is the (minus) Kullback entropy (see Kullback–Leibler distance). In physics it corresponds to the free energy divided by $k_B T$ (where $k_B$ is the Boltzmann constant and $T$ is the absolute temperature):

if $p_i^* = \frac{\exp(-E_i / k_B T)}{Z}$, $Z = \sum_j \exp(-E_j / k_B T)$ (the Boltzmann distribution), then

$H_h(p) = \sum_i p_i \ln\!\left(\frac{p_i}{p_i^*}\right) = \frac{\langle E \rangle - T S}{k_B T} + \ln Z = \frac{F}{k_B T} + \ln Z,$

where $\langle E \rangle = \sum_i E_i p_i$, $S = -k_B \sum_i p_i \ln p_i$, and $F = \langle E \rangle - T S$ is the free energy.

  • $h(x) = -\ln x$, $H_h(p) = -\sum_i p_i^* \ln\!\left(\frac{p_i}{p_i^*}\right)$;

this function is an analogue of the free energy for the Burg entropy, which is widely used in signal processing:

$S_{\mathrm{Burg}}(p) = \sum_i \ln p_i.$

  • $h(x) = \frac{(x-1)^2}{2}$, $H_h(p) = \frac{1}{2} \sum_i \frac{(p_i - p_i^*)^2}{p_i^*}$;

this is a quadratic approximation to the (minus) Kullback entropy near the equilibrium point. Up to a term that is constant in time, this function coincides with the (minus) Fisher entropy, which is given by the following choice:

  • $h(x) = \frac{x^2}{2}$, $H_h(p) = \frac{1}{2} \sum_i \frac{p_i^2}{p_i^*}$;

this is the (minus) Fisher entropy.

  • $h(x) = \frac{x^q - x}{q - 1}$, $H_h(p) = \frac{1}{q - 1} \sum_i p_i \left( \left( \frac{p_i}{p_i^*} \right)^{q-1} - 1 \right)$;

this is one of the analogues of the free energy for the Tsallis entropy. The Tsallis entropy

$S_q(p) = \frac{1}{q - 1} \left( 1 - \sum_i p_i^q \right)$

serves as the basis for the statistical physics of non-extensive quantities. As $q \to 1$ it tends to the classical Boltzmann–Gibbs–Shannon entropy, and the corresponding Morimoto function tends to the (minus) Kullback entropy.
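The monotone decay of the Morimoto functions can be observed numerically: integrate the master equation and evaluate $H_h$ along the trajectory. A sketch with invented symmetric rates (so the equilibrium is uniform) and three of the convex functions $h$ listed above:

```python
import math

def H(p, p_eq, h):
    """Morimoto function H_h(p) = sum_i p*_i h(p_i / p*_i)."""
    return sum(pe * h(pi / pe) for pi, pe in zip(p, p_eq))

def master_step(p, W, dt):
    """Explicit-Euler step of dp_i/dt = sum_j (w_ij p_j - w_ji p_i)."""
    n = len(p)
    return [p[i] + dt * sum(W[i][j] * p[j] - W[j][i] * p[i]
                            for j in range(n) if j != i)
            for i in range(n)]

# Hypothetical symmetric rates -> uniform equilibrium.
W = [[0.0, 1.0, 0.5],
     [1.0, 0.0, 2.0],
     [0.5, 2.0, 0.0]]
p_eq = [1/3, 1/3, 1/3]
candidates = {
    "l1 distance":    lambda x: abs(x - 1),
    "minus Kullback": lambda x: x * math.log(x),
    "quadratic":      lambda x: (x - 1) ** 2 / 2,
}
p0 = [0.8, 0.15, 0.05]
for name, h in candidates.items():
    values, q = [], p0[:]
    for _ in range(2000):
        values.append(H(q, p_eq, h))
        q = master_step(q, W, 1e-3)
    print(name, values[0] > values[-1])  # each H_h decreases in time
```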

created: 2015-01-03
updated: 2021-03-13