19.7. Markov random process

The assumptions about the Poisson nature of the flow of requests and the exponential distribution of the service time are valuable because they allow the apparatus of so-called Markov random processes to be applied in queueing theory.

A process taking place in a physical system is called a Markov process (or a process without after-effect) if, for each moment of time, the probability of any state of the system in the future depends only on the state of the system at the present moment t₀ and does not depend on how the system arrived at that state.

Consider an elementary example of a Markov random process. A point S wanders randomly along the abscissa axis Ox. At the initial moment t = 0 the point S is at the origin (x = 0) and remains there for one second. A second later a coin is tossed: if it shows heads, the point S moves one unit of length to the right; if tails, to the left. A second after that the coin is tossed again and the same random displacement is made, and so on. This process of changing the position of the point (its "wandering") is a random process with discrete time t = 0, 1, 2, … and a countable set of states

x = …, −2, −1, 0, +1, +2, …

A diagram of possible transitions for this process is shown in Fig. 19.7.1.


Fig. 19.7.1.

Let us show that this process is Markov. Indeed, imagine that at some moment of time the system is, for example, in the state x = 1, one unit to the right of the origin. The possible positions of the point one unit of time later are x = 0 and x = 2, with probabilities 1/2 and 1/2; two units later they are x = −1, x = 1, x = 3, with probabilities 1/4, 1/2, 1/4, and so on. Obviously, all these probabilities depend only on where the point is at the present moment and not at all on how it got there.
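The wandering described above is easy to check by direct simulation; the following sketch (function name and trial count are our own choices) estimates the distribution of the point's position a given number of coin tosses after starting from x = 1:

```python
import random
from collections import Counter

def walk_distribution(start, steps, trials=100_000, seed=1):
    """Empirical distribution of the walker's position `steps`
    coin tosses after starting at position `start`."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        x = start
        for _ in range(steps):
            # Each toss moves the point one unit right or left,
            # with probability 1/2 each, independently of the past.
            x += 1 if rng.random() < 0.5 else -1
        counts[x] += 1
    return {pos: n / trials for pos, n in sorted(counts.items())}

# From x = 1, one step later: x = 0 or 2, probability 1/2 each.
print(walk_distribution(start=1, steps=1))
# Two steps later: x = -1, 1, 3 with probabilities 1/4, 1/2, 1/4,
# regardless of how the point originally reached x = 1.
print(walk_distribution(start=1, steps=2))
```

Because each toss is independent of the whole history, the resulting distribution depends only on the current position, which is exactly the Markov property.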

Consider another example. There is a technical device S composed of elements (parts) of two types, A and B, with different durability. These elements can fail at random moments, independently of one another, and the proper operation of every element is absolutely necessary for the device as a whole to work. The failure-free operating time of an element is a random variable distributed according to the exponential law; for elements of types A and B the parameters of this law are different and equal to λ₁ and λ₂ respectively. When the device fails, measures are taken immediately to identify the cause, and the detected faulty element is at once replaced with a new one. The time required to restore (repair) the device is distributed exponentially with parameter μ₁ (if an element of type A has failed) or μ₂ (if an element of type B has failed).

In this example, the random process occurring in the system is a Markov process with continuous time and a finite set of states:

s₀ - all elements are intact, the device is working;

s₁ - an element of type A has failed, the device is being repaired;

s₂ - an element of type B has failed, the device is being repaired.

The scheme of possible transitions is shown in Fig. 19.7.2.


Fig. 19.7.2.

Indeed, the process possesses the Markov property. Suppose, for example, that at the moment t₀ the system is in the state s₀ (the device is working). Since the failure-free operating time of every element is exponential, the moment at which an element fails in the future does not depend on how long it has already been working (on when it was installed). Therefore the probability that the system will remain in the state s₀, or leave it, during some future interval does not depend on the "prehistory" of the process. Suppose now that at the moment t₀ the system is in the state s₁ (an element of type A has failed). Since the repair time is also exponential, the probability that the repair ends at any moment after t₀ does not depend on when the repair began or on when the remaining (serviceable) elements were installed. Thus, the process is Markov.
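This three-state repair process can be simulated directly. The sketch below uses hypothetical rate values (λ₁, λ₂, μ₁, μ₂ are our own choices, not from the text) and estimates the long-run fraction of time the device spends in each state:

```python
import random

# Hypothetical rates (per unit time) for illustration only:
LAM1, LAM2 = 0.1, 0.2   # failure rates of element types A and B
MU1, MU2 = 2.0, 1.0     # repair rates after an A- or B-failure

def simulate(total_time=200_000.0, seed=1):
    """Simulate the states s0 (working), s1 (repairing after an
    A-failure), s2 (repairing after a B-failure); return the
    fraction of time spent in each state."""
    rng = random.Random(seed)
    time_in = [0.0, 0.0, 0.0]
    t, state = 0.0, 0
    while t < total_time:
        if state == 0:
            # Two competing exponential clocks; whichever element
            # fails first determines the next state.
            t1 = rng.expovariate(LAM1)
            t2 = rng.expovariate(LAM2)
            dwell, nxt = (t1, 1) if t1 < t2 else (t2, 2)
        else:
            # Exponential repair time, then back to the working state.
            dwell = rng.expovariate(MU1 if state == 1 else MU2)
            nxt = 0
        time_in[state] += dwell
        t += dwell
        state = nxt
    total = sum(time_in)
    return [x / total for x in time_in]

print(simulate())
```

With these rates the balance equations give stationary probabilities p₀ = 0.8, p₁ = 0.04, p₂ = 0.16, and the simulated time fractions agree closely.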

Note that the exponential distribution of the operating time and the exponential distribution of the repair time are essential conditions here, without which the process would not be Markov. Indeed, suppose the failure-free operating time of an element is distributed not exponentially but according to some other law, for example uniformly on the interval (t₀, t₁). This means that each element is guaranteed to work for a time t₀, and on the interval from t₀ to t₁ it can fail at any moment with the same probability density. Suppose that at some moment an element is working properly. Obviously, the probability that it will fail during some future interval now depends on how long ago the element was installed, that is, on the prehistory, and the process will not be Markov.
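The contrast between the two laws can be checked numerically. The sketch below (names and parameter values are our own) estimates the conditional survival probability P(T > s + t | T > s): for the exponential law it equals the unconditional P(T > t), while for a uniform law it depends on the age s:

```python
import math
import random

def survival_after(sample, s, t, trials=200_000, seed=2):
    """Estimate P(T > s + t | T > s) by simulation, where
    `sample(rng)` draws one lifetime T."""
    rng = random.Random(seed)
    survived_s = survived_both = 0
    for _ in range(trials):
        T = sample(rng)
        if T > s:
            survived_s += 1
            if T > s + t:
                survived_both += 1
    return survived_both / survived_s

def exp_life(rng):
    # Exponential lifetime, mean 1: the memoryless case.
    return rng.expovariate(1.0)

def uni_life(rng):
    # Uniform on (1, 3): guaranteed to work 1 unit, dead by 3.
    return rng.uniform(1.0, 3.0)

# Exponential: conditional survival ~= exp(-1) ~ 0.368,
# the same as the unconditional P(T > 1) for a fresh element.
print(survival_after(exp_life, s=0.5, t=1.0))
# Uniform: conditional survival ~ 1/3, although a fresh element
# survives 1 unit with certainty -- the age of the element matters.
print(survival_after(uni_life, s=1.5, t=1.0))
```

Only the exponential law has this "no after-effect" property, which is why it is indispensable for Markov models.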

The situation is the same with the repair time: if it is not exponential and an element is under repair at some moment, the remaining repair time depends on when the repair began; again the process is not Markov.

In general, the exponential distribution plays a special role in the theory of Markov random processes with continuous time. It is easy to verify that in a stationary Markov process the time during which the system stays in any given state is always distributed according to the exponential law (with a parameter that, generally speaking, depends on the state). Indeed, suppose that at the moment t₀ the system is in some state sᵢ and has already been there for some time. By the definition of a Markov process, the probability of any event in the future does not depend on the prehistory; in particular, the probability that the system leaves the state sᵢ within a time τ must not depend on how long the system has already spent in that state. Hence the time the system spends in the state sᵢ must be distributed according to the exponential law.
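This claim can be illustrated by discretizing time: if in every short interval dt the system leaves the state with probability λ·dt, independently of how long it has already stayed, then the resulting dwell time is (approximately) exponential. A small sketch under that assumption (all names and values are ours):

```python
import math
import random

def holding_times(rate, dt, n=10_000, seed=3):
    """Dwell time in a state when, in each short interval dt, the
    system leaves with probability rate*dt regardless of how long
    it has already stayed (the no-after-effect assumption)."""
    rng = random.Random(seed)
    times = []
    for _ in range(n):
        t = 0.0
        while rng.random() >= rate * dt:  # stayed another dt
            t += dt
        times.append(t + dt)              # the interval it left in
    return times

ts = holding_times(rate=2.0, dt=0.01)
mean = sum(ts) / len(ts)
print(mean)   # ~ 1/rate = 0.5, the mean of the exponential law
frac = sum(t > 0.5 for t in ts) / len(ts)
print(frac)   # ~ exp(-2 * 0.5) ~ 0.368, the exponential survival
```

As dt shrinks, the geometric number of "stay" steps converges to an exponential dwell time with parameter λ, in agreement with the argument above.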

When the process taking place in a physical system with a countable set of states and continuous time is Markov, it can be described by ordinary differential equations in which the unknown functions are the state probabilities p₁(t), p₂(t), …. The construction and solution of such equations will be demonstrated in what follows using the example of the simplest queuing system.
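To preview what such equations look like, here is a sketch for the three-state repair example, with the same hypothetical rates as before (λ₁, λ₂, μ₁, μ₂ are our own illustrative values). Each equation balances the probability flowing into a state against the flow out of it, and a simple Euler scheme integrates them:

```python
# State-probability equations for the repair example:
#   dp0/dt = -(lam1 + lam2) p0 + mu1 p1 + mu2 p2
#   dp1/dt = lam1 p0 - mu1 p1
#   dp2/dt = lam2 p0 - mu2 p2
LAM1, LAM2, MU1, MU2 = 0.1, 0.2, 2.0, 1.0  # hypothetical rates

def state_probabilities(t_end, dt=0.001):
    """Euler-integrate the equations from p0 = 1, p1 = p2 = 0
    (device initially working) up to time t_end."""
    p0, p1, p2 = 1.0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        d0 = -(LAM1 + LAM2) * p0 + MU1 * p1 + MU2 * p2
        d1 = LAM1 * p0 - MU1 * p1
        d2 = LAM2 * p0 - MU2 * p2
        p0, p1, p2 = p0 + d0 * dt, p1 + d1 * dt, p2 + d2 * dt
        t += dt
    return p0, p1, p2

# The probabilities settle to the stationary values
# p0 = 0.8, p1 = 0.04, p2 = 0.16 for these rates.
print(state_probabilities(50.0))
```

Note that the three right-hand sides sum to zero, so the total probability stays equal to one throughout the integration.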

