American Journal of Computational and Applied Mathematics
p-ISSN: 2165-8935 e-ISSN: 2165-8943
2016; 6(1): 7-13
doi:10.5923/j.ajcam.20160601.02

Michael Gr. Voskoglou
Professor Emeritus of Mathematical Sciences, Graduate Technological Educational Institute of Western Greece, School of Technological Applications, Greece
Correspondence to: Michael Gr. Voskoglou, Professor Emeritus of Mathematical Sciences, Graduate Technological Educational Institute of Western Greece, School of Technological Applications, Greece.
Copyright © 2016 Scientific & Academic Publishing. All Rights Reserved.
This work is licensed under the Creative Commons Attribution International License (CC BY).
http://creativecommons.org/licenses/by/4.0/

Markov chains provide ideal conditions for the study and mathematical modelling of certain situations that depend on random variables. The basic concepts of the corresponding theory were introduced by Markov in 1907, in his work on coding literary texts. Since then, the theory of Markov chains has been developed by a number of leading mathematicians, such as Kolmogorov, Feller and others. However, only since the 1960s has the importance of this theory for the natural, social and most other applied sciences been recognized. In this review paper we present applications of finite Markov chains to management problems, which can be solved, as can most problems concerning applications of Markov chains in general, by distinguishing between two types of such chains: the ergodic and the absorbing ones.
Keywords: Stochastic Models, Finite Markov Chains, Ergodic Chains, Absorbing Chains
Cite this paper: Michael Gr. Voskoglou, Applications of Finite Markov Chain Models to Management, American Journal of Computational and Applied Mathematics, Vol. 6 No. 1, 2016, pp. 7-13. doi: 10.5923/j.ajcam.20160601.02.
Consider a finite Markov Chain with n states s1, s2, …, sn. Denote by pij the transition probability from state si to state sj, i, j = 1, 2, …, n; then the matrix A = [pij] is called the transition matrix of the Chain. Since the transition from a state to some state (possibly itself) is a certain event, we have that pi1 + pi2 + … + pin = 1, for i = 1, 2, …, n.

The row matrix Pk = [p1(k) p2(k) … pn(k)], known as the probability vector of the Chain, gives the probabilities pi(k) for the Chain to be in state si at step k, for i = 1, 2, …, n and k = 0, 1, 2, …. We obviously have again that p1(k) + p2(k) + … + pn(k) = 1.

The following well-known Proposition enables one to make short-run forecasts for the evolution of the various situations that can be represented by a finite Markov Chain. The proof of the Proposition is also sketched, to show to the reader who is not expert on the subject the close connection between Markov Chains and Probability (in fact, Markov Chains are considered a topic of Probability theory).
while one (and only one) of the events E1, E2, …, En always occurs. Therefore, by the total probability formula ([7], Chapter 3) we have that

P(E) = P(E1)P(E | E1) + P(E2)P(E | E2) + … + P(En)P(E | En),

where P(E | Ei) denote the corresponding conditional probabilities. But P(E) = p1(1), P(Ei) = pi(0) and P(E | Ei) = pi1. Therefore p1(1) = p1(0)p11 + p2(0)p21 + … + pn(0)pn1, and in the same way

pi(1) = p1(0)p1i + p2(0)p2i + … + pn(0)pni, i = 1, 2, …, n (2).

Writing the system (2) in matrix form we obtain (1), and working similarly we can show that in general Pk+1 = PkA, for all non-negative integers k.
Further, since K circulates for the first time in the market, we have that P0 = [0 1]; therefore P2 = P0A^2 = [0.3 0.7]. Thus the market's share for K two weeks after its first circulation will be 30%.
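The two-week forecast can be checked numerically; the sketch below assumes the 2 x 2 transition matrix A implied by the steady-state equations of this example (rows summing to 1):

```python
import numpy as np

# Transition matrix of the 2-state market chain, inferred from the
# steady-state equations p1 = 0.7p1 + 0.2p2 and p2 = 0.3p1 + 0.8p2.
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])

P0 = np.array([0.0, 1.0])                 # K has just entered the market
P2 = P0 @ np.linalg.matrix_power(A, 2)    # P2 = P0 A^2
print(P2)                                 # market share of K: 0.3, i.e. 30%
```

The same call with a larger exponent gives any short-run forecast Pk = P0 A^k.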
Let P = [p1 p2] be the limiting probability vector; then the equality P = PA gives p1 = 0.7p1 + 0.2p2 and p2 = 0.3p1 + 0.8p2, or equivalently 0.3p1 – 0.2p2 = 0. Solving the linear system of the above equation together with p1 + p2 = 1, one finds that p1 = 0.4, i.e. the market's share for K in the long run will be 40%.

The next problem concerns the application of a 3-state Ergodic Markov Chain to the production process of an industry:
Let P = [p1 p2 p3] be the limiting probability vector; then the equality P = PA gives p1 = 0.4p3, p2 = 0.4p1 + 0.4p2 + 0.6p3 and p3 = 0.6p1 + 0.6p2. Adding the first two of these equations yields the third one. Solving the linear system of the first two equations together with p1 + p2 + p3 = 1, one finds that p1 = 0.15. Therefore the probability of having unsatisfied orders in the long run is 15%.
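The limiting vector can also be found numerically. A minimal sketch, assuming the 3 x 3 transition matrix A reconstructed from the equations above (column j of A collects the coefficients of the equation for pj):

```python
import numpy as np

# Transition matrix reconstructed from p1 = 0.4p3,
# p2 = 0.4p1 + 0.4p2 + 0.6p3 and p3 = 0.6p1 + 0.6p2.
A = np.array([[0.0, 0.4, 0.6],
              [0.0, 0.4, 0.6],
              [0.4, 0.6, 0.0]])

# P = PA is equivalent to (A^T - I) P^T = 0; one redundant equation
# is replaced by the normalisation p1 + p2 + p3 = 1.
M = A.T - np.eye(3)
M[2] = 1.0
b = np.array([0.0, 0.0, 1.0])
P = np.linalg.solve(M, b)
print(P)   # steady state: p1 = 0.15, p2 = 0.475, p3 = 0.375
```

The first component confirms the 15% long-run probability of unsatisfied orders.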
Given an Absorbing Markov Chain with n states, k of which are absorbing, we bring its transition matrix A to its canonical form A* by listing the absorbing states first, and then we make a partition of A* of the form

A* = | I  O |
     | R  Q |

where I is the k × k identity matrix, O is a zero matrix, R is the (n – k) × k transition matrix from the non-absorbing to the absorbing states and Q is the (n – k) × (n – k) transition matrix between the non-absorbing states. Denote by In–k the (n – k) × (n – k) identity matrix; it can be shown ([4], Chapter 3) that the square matrix In–k – Q always has a non-zero determinant. Then the fundamental matrix of the Absorbing Chain is defined to be the matrix

N = (In–k – Q)^(-1)    (4)
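Formula (4) translates directly into code. The sketch below assumes Q and R have already been read off from the canonical form A*; the function names are illustrative only:

```python
import numpy as np

def fundamental_matrix(Q):
    """N = (I - Q)^(-1), formula (4): Q is the (n-k) x (n-k)
    transition matrix between the non-absorbing states."""
    return np.linalg.inv(np.eye(Q.shape[0]) - Q)

def absorption_probabilities(Q, R):
    """B = N R: entry b_ij is the probability that the Chain,
    started in the i-th non-absorbing state, is eventually
    absorbed in the j-th absorbing state."""
    return fundamental_matrix(Q) @ R

# Toy check: a single transient state that stays put with
# probability 0.5 and is absorbed otherwise.
N = fundamental_matrix(np.array([[0.5]]))                          # mean visits: 2
B = absorption_probabilities(np.array([[0.5]]), np.array([[0.5]])) # absorption: 1
```

The entries nij of N give the mean number of times the Chain visits the j-th non-absorbing state before absorption, given that it starts in the i-th one.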
after a straightforward calculation, that the first row of the fundamental matrix N (the row corresponding to the starting state s1) is [1.25 1.25 1]. Thus, since in this case the Chain always starts from state s1, the mean number of times in states s1 and s2 before absorption is 1.25 each, and in state s3 it is 1. Therefore the mean time needed for the completion of the whole process is 1.25 * (10 + 4) + 3 + 45 = 65.5 days.

When an Absorbing Markov Chain has more than one absorbing state, the element bij of the matrix B = NR = [bij] gives the probability for the Chain, starting in state si, to be absorbed in state sj ([4], Chapter 3). This is illustrated in the following example, which is a special case of a "random-walk" problem:
Then, it is straightforward to check that the fundamental matrix of the Chain is
Thus, since the truck starts its route from the storehouse A2 (state s3), the mean number of its stops at storehouse A1 (state s2) is 1, at storehouse A2 (state s3) is 2 and at storehouse A3 (state s4) is 1.
Thus the probability for the truck to terminate its route at the city C1 (state s1), when it starts from storehouse A2 (state s3), is 50%. Our last application illustrates the fact that great care is sometimes needed in order to "translate" correctly the mathematical results of the Markov Chain model in terms of the corresponding real situation.
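These figures can be reproduced numerically under the natural reading of the random-walk example (an assumption on our part: from each storehouse the truck moves to either neighbouring stop with probability 1/2, and the two route endpoints, city C1 as state s1 and a second city as state s5, are the absorbing states):

```python
import numpy as np

# Non-absorbing states s2, s3, s4 (storehouses A1, A2, A3);
# absorbing states s1 and s5 (the two cities). Symmetric walk assumed.
Q = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])
R = np.array([[0.5, 0.0],
              [0.0, 0.0],
              [0.0, 0.5]])

N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix, formula (4)
B = N @ R                          # absorption probabilities

print(N[1])      # row for s3: mean stops 1, 2 and 1 at A1, A2, A3
print(B[1, 0])   # probability of ending at C1 from A2: 0.5
```

The second row of N matches the mean numbers of stops reported above, and B gives the 50% absorption probability at C1.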
Further, using a mathematical software package on a PC to speed up the necessary calculations, it is straightforward to check that the fundamental matrix of the chain is
Observing the fundamental matrix N of the chain, one finds that n13 = 0.729 and n14 = 0.521, i.e. for a first-year student of the college the mean time of attendance in the third and in the fourth year of studies is less than one year! However, this is not paradoxical, because there is always a possibility for a student to be withdrawn from the college, due to unsatisfactory performance, before entering the third or fourth year of studies.

Since n11 = n22 = n33 = n44 = 1.429, we find that the mean time of attendance of a student in each year of studies is 1.429 years, while the mean time needed for his/her graduation is 1.429 * 4 = 5.716 years.

Further, observing the matrix B one finds that b15 = 0.74, i.e. the probability for a student to graduate is 74%.