What is a Markov chain: Definition and 97 Discussions

A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). It is named after the Russian mathematician Andrey Markov.
Markov chains have many applications as statistical models of real-world processes, such as studying cruise control systems in motor vehicles, queues or lines of customers arriving at an airport, currency exchange rates and animal population dynamics. Markov processes are the basis for general stochastic simulation methods known as Markov chain Monte Carlo, which are used for simulating sampling from complex probability distributions, and have found application in Bayesian statistics, thermodynamics, statistical mechanics, physics, chemistry, economics, finance, signal processing, information theory and speech processing. The adjectives Markovian and Markov are used to describe something that is related to a Markov process.

View More On Wikipedia.org
  1. fluidistic

    Probability of Finding a Random Walker in D Dimensions After N Steps

    Homework Statement Hi guys, I'm absolutely desperate about the following problem: Consider a random walker who can make steps only to neighboring sites in D dimensions, where D is an arbitrary natural number. Assume that the distance between 2 adjacent sites is the same for all sites and that the...
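    For the simple symmetric walk (assuming the walker steps to each of its 2D nearest neighbours with equal probability 1/(2D), which the truncated statement seems to imply), the standard Fourier representation of the N-step distribution is a common starting point: ##P_N(\mathbf{r}) = \int_{[-\pi,\pi]^D} \frac{d^D k}{(2\pi)^D}\, e^{-i\mathbf{k}\cdot\mathbf{r}} \left( \frac{1}{D} \sum_{j=1}^{D} \cos k_j \right)^{N}##, since the factor in parentheses is the characteristic function of a single step.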
  2. W

    Learning Markov Chain: Clarifying Persistent vs Regular

    Hi all, I am trying to understand the concept of a Markov chain (a scan of 2 pages is attached). Before this text I already studied the notes on Markov chains at: http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf I am a little confused about the...
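    For reference, the two notions live at different levels: a state is persistent (recurrent) if the chain starting there returns to it with probability 1, while regularity is a property of the whole chain, namely that ##\mathbf{P}^n## has all entries strictly positive for some n. For a finite chain, regular is equivalent to irreducible plus aperiodic.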
  3. A

    Markov Chain aggregation method

    I am using a Markov chain to get the 10 best search results from the union of 3 different search engines. The top 10 results are taken from each engine to form a set of 30 results. The chain starts at a state x drawn from the uniform distribution on the set S = {1, 2, 3, ..., 30}. If the current state is page i...
  4. M

    How is the Ising Model a Markov Chain?

    The title says it all. It looks like the configuration probability only depends on where you want to go, not what state you are in now. Yet when I watch simulations, there is clearly a dependence on the previous state. Is there something pretty basic I'm misunderstanding about configuration...
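    A detail that often resolves this: the Boltzmann weight is the chain's stationary distribution, not its transition rule. In Metropolis dynamics, for instance, a proposed flip is accepted with probability ##\min\left(1,\, e^{-\beta\,\Delta E}\right)##, and ##\Delta E## is computed relative to the current configuration, so the transition probabilities do depend on the present state, exactly as the Markov property requires.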
  5. G

    Stationary distribution of a Markov chain

    Homework Statement Find the stationary distribution of ##\left( \begin{array}{ccccc} \frac{1}{2} & \frac{1}{2} & 0 & 0 & 0 \\ \frac{1}{3} & \frac{2}{3} & 0 & 0 & 0 \\ 0 & \frac{1}{3} & \frac{2}{3} & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \end{array} \right)## Homework Equations...
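    A minimal numpy sketch for problems of this kind: stationary distributions solve ##\pi P = \pi##, i.e. they are left eigenvectors of P with eigenvalue 1. Note that this particular chain has more than one closed class, so eigenvalue 1 is repeated and any convex combination of the normalized vectors found below is also stationary.

        import numpy as np

        # Transition matrix from the problem (rows sum to 1).
        P = np.array([
            [1/2, 1/2, 0,   0,   0],
            [1/3, 2/3, 0,   0,   0],
            [0,   1/3, 2/3, 0,   0],
            [0,   0,   0,   0,   1],
            [0,   0,   0,   1,   0],
        ])

        # Left eigenvectors of P for eigenvalue 1 = eigenvectors of P transpose.
        w, v = np.linalg.eig(P.T)
        for k in np.where(np.isclose(w, 1))[0]:
            pi = np.real(v[:, k])
            pi = pi / pi.sum()   # normalize to a probability vector
            print(pi)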
  6. L

    What Are the Markov Chain Probabilities for a Taxicab's Moves?

    Homework Statement A taxicab moves between the airport, Hotel A, and Hotel B according to a Markov chain with transition probabilities: P(airport → A) = 0.7, P(airport → B) = 0.3, P(A → airport) = 0.9, P(A → B) = 0.1, P(B → airport) = 0.8, P(B → A) = 0.2. A-If the taxicab starts...
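    One way to organize the data (taking the state order airport, A, B, which is an assumption about labeling) is the transition matrix ##P = \begin{pmatrix} 0 & 0.7 & 0.3 \\ 0.9 & 0 & 0.1 \\ 0.8 & 0.2 & 0 \end{pmatrix}##; the n-step questions then reduce to multiplying the initial distribution by powers of P.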
  7. R

    Find all Invariant Probability Measures for P (Markov Chain)

    Homework Statement Find all Invariant Probability Measures for P (Markov Chain) E = {1,2,3,4,5} The screenshot below has P and my attempted solution. I am wondering if it is acceptable to have infinitely many answers ("all" seems to indicate that it is). Basically, I had too many unknowns...
  8. R

    Find Lambda for Simple Walk on Z (Markov Chain)

    Homework Statement Find \lambda= \{\lambda_i\} with \lambda_0 = 1 if P(X_{n+1}=i+1|X_n=i) = p, P(X_{n+1} = i-1|X_n = i) = 1 - p, i \in Z, 0<p<1 Homework Equations \lambda P = \lambda The Attempt at a Solution I used my relevant equation to write out: (1-p)\lambda_0 + p\lambda_{-2} =...
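    A candidate worth checking against ##\lambda P = \lambda##: with q = 1 - p, the balance equations read ##p\,\lambda_{i-1} + q\,\lambda_{i+1} = \lambda_i##, and they are solved by ##\lambda_i = a + b\,(p/q)^i##; the condition ##\lambda_0 = 1## then forces a + b = 1.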
  9. H

    Concept question on irreducibility (Markov chain)

    Homework Statement This is not really a homework question, just something that I'm confused about. I'm having trouble understanding the movement of a Markov chain. Say S = {0,1,2,...} and 0<q<1. Let p(i,i+1)=q and p(i,i-1)=1-q. I can see how this Markov chain moves (back and forth...
  10. N

    [Markov chain] Prove that a probability equals 6/π²

    Homework Statement Homework Equations N/A The Attempt at a Solution I'll briefly explain my reasoning so far; please ignore it if it comes across as too jumbled: --- Let P denote the Markov matrix associated with this problem, then I think I was able to argue that the...
  11. S

    Markov chain, sum of N dice rolls

    Question: Let Xn be the maximum score obtained after n throws of a fair die. a) Prove that Xn is a Markov chain and write down the transition matrix. I'm having a problem starting the transition matrix; I'm assuming the states are meant to be the sum. Then do you write out the transition...
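    If the chain is the running maximum (as the body of the question says, rather than the sum in the title), then from state m the chain stays put whenever the new roll is at most m, and jumps to each larger value with probability 1/6. A minimal numpy sketch of that transition matrix:

        import numpy as np

        # Running maximum of fair die rolls: states 1..6; from m, stay at m
        # with probability m/6, jump to each k > m with probability 1/6.
        P = np.zeros((6, 6))
        for m in range(1, 7):
            P[m - 1, m - 1] = m / 6
            for k in range(m + 1, 7):
                P[m - 1, k - 1] = 1 / 6
        print(P)            # each row sums to 1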
  12. J

    Solving Markov Chain Question on Two Switches

    Hi, I need help with answering this question. Firstly, I'm not sure what the transition matrix should like. Should there be 2 states? One where both switches are off and one where both switches are on? The question is: Suppose that each of 2 switches is either on or off during the day. On...
  13. J

    Gambler's Ruin - Markov Chain Problem

    This is probably a noob question, my background in probability theory isn't great, but I was shown this problem in a lecture: "Suppose a gambler starts out with £n, and makes a series of £1 bets against the house. Let the probability of winning each bet be p, and of losing be q = 1 − p. If...
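    For reference, the classical answer (assuming the game stops when the gambler is broke or reaches a target of £N): with q = 1 − p ≠ p, the ruin probability starting from £n is ##\frac{(q/p)^n - (q/p)^N}{1 - (q/p)^N}##, and it is 1 − n/N in the fair case p = q = 1/2.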
  14. T

    Why Do Lines 3 and 4 Equate in Random Walk Probability Calculations?

    Suppose X is a random walk with probability P(X_k=+1)=p and P(X_k=-1)=q=1-p, and S_n=X_1+X_2+...+X_n. Can anyone explain why line 3 equals line 4?
    P(S_k - S_0 ≠ 0, S_k - S_1 ≠ 0, …, S_k - S_{k-1} ≠ 0)
    = P(X_k + X_{k-1} + ⋯ + X_1 ≠ 0, X_k + X_{k-1} + ⋯ + X_2 ≠ 0, …, X_k ≠ 0)
    = P(X_k ≠ 0, X_k + X_{k-1} ≠ 0...
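    As transcribed, the two lines list exactly the same events: for each j = 0, …, k−1 the event ##S_k - S_j \neq 0## is, written out, ##X_{j+1} + \cdots + X_k \neq 0##, and line 4 simply lists this collection in the reverse order (j = k−1 first). Since the probability of an intersection does not depend on the order in which the events are listed, the two lines are equal.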
  15. L

    How to Generate a 100 State Sequence Using a Markov Chain Model?

    I have a state transition probability matrix and a state probability vector, [0.9 0.1; 0.1 0.9] and [0.4 0.6] respectively. Now I want to generate states of 1 and 0 according to this, say a 100-state sequence. Any sort of help would be appreciated. Thanks.
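    A minimal numpy sketch of the simulation (which index corresponds to "state 1" versus "state 0" is an assumption about the poster's labeling): draw the initial state from the given vector, then repeatedly draw the next state from the row of the transition matrix indexed by the current state.

        import numpy as np

        rng = np.random.default_rng()

        P = np.array([[0.9, 0.1],
                      [0.1, 0.9]])     # transition matrix from the post
        p0 = np.array([0.4, 0.6])      # initial state distribution

        n = 100
        states = np.empty(n, dtype=int)
        states[0] = rng.choice(2, p=p0)                  # draw the initial state
        for t in range(1, n):
            states[t] = rng.choice(2, p=P[states[t-1]])  # step using the current row
        print(states)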
  16. X

    Persistence of State i in a Markov Chain

    In a Markov chain, show that a state i is persistent if and only if the mean number of visits to the state i is infinite given the chain started in state i. I thought about looking at the mean recurrence time, but that's all I have so far.
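    The key identity here: the expected number of returns to i is ##\sum_{n=1}^{\infty} p_{ii}(n)##, and since successive returns are independent trials with success probability ##f_i## (the return probability), the number of returns is geometric with mean ##f_i/(1-f_i)##; this is infinite exactly when ##f_i = 1##, i.e. when i is persistent.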
  17. X

    Show all other states are transient in Markov chain

    Let X be a Markov chain with a state s that is absorbing, i.e. pss(1) = 1. All other states communicate with s i.e. i → s for all states i ∈ S. Show that all states in S except s are transient. I understand this intuitively, but I'm not really sure how to start the proof.
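    One standard route: since i → s, there is a positive-probability path from i to s, and a shortest such path never revisits i; following it, the chain reaches the absorbing state s without returning to i, so with some probability ##\alpha > 0## the chain never returns, giving ##f_i \le 1 - \alpha < 1##, i.e. i is transient.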
  18. S

    Any fast way to compute the fixed vector of a Markov chain transition matrix?

    * I have already posted this in General Math, but I guess the problem is more like a linear algebra problem. Currently I am using a rather simple method: solve for the vector w in (M-I)w = 0 (replacing one equation by w1 + w2 + ... + wn = 1). Is there any faster way to do this? Thank you.
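    A minimal numpy version of exactly that replacement trick, written for a row-stochastic M so the system is ##(M^T - I)w = 0## (drop the transpose for a column-stochastic convention). For a single stationary vector this direct dense solve is already about as fast as it gets; for very large sparse chains the usual alternative is power iteration.

        import numpy as np

        def fixed_vector(M):
            """Fixed (stationary) vector of a row-stochastic matrix M:
            solve w M = w with sum(w) = 1 by replacing one balance
            equation with the normalization, as described above."""
            n = M.shape[0]
            A = M.T - np.eye(n)   # (M^T - I) w = 0 for the column vector w
            A[-1, :] = 1.0        # replace the last equation by w1 + ... + wn = 1
            b = np.zeros(n)
            b[-1] = 1.0
            return np.linalg.solve(A, b)

        print(fixed_vector(np.array([[0.9, 0.1],
                                     [0.1, 0.9]])))   # -> [0.5 0.5]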
  19. J

    Why Is My Markov Chain Simulation Producing Unexpected Output?

    Homework Statement Homework Equations Seems pretty straightforward ... The Attempt at a Solution Here's what I put: P = zeros(10,10); P(1,2) = 1; P(2,1) = 1/2; P(2,3) = 1/2; P(3,1) = 1/2; P(3,4) = 1/2; P(4,1) = 1/3; P(4,2) = 1/3; P(4,5) = 1/3...
  20. J

    Do Markov Chain Transition Matrices Sum by Row or Column?

    Homework Statement Homework Equations The Attempt at a Solution A few things. First of all, the homework problem notes that "all the columns should sum to 1," whereas Wikipedia says ∑_j P_ij = 1 when we sum along the row i. Second of all, I don't know where to...
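    Both conventions are in circulation, which explains the clash: with a row-stochastic matrix, ##\sum_j P_{ij} = 1## for each row i and distributions are row vectors updated as ##\pi^{(n+1)} = \pi^{(n)} P##, while with a column-stochastic matrix the columns sum to 1 and column vectors are updated as ##p^{(n+1)} = Q\,p^{(n)}##; one matrix is just the transpose of the other.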
  21. A

    Difference between martingale and Markov chain

    What is the difference between a martingale and a Markov chain? As it seems, if a process is a martingale, then the future expected value depends on the current value of the process, while in a Markov chain the probability of a future value (not the expected value) depends on the...
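    Side by side, the two definitions make the contrast explicit: a martingale constrains a conditional expectation, ##E[X_{n+1} \mid X_0, \dots, X_n] = X_n##, while the Markov property constrains a conditional distribution, ##P(X_{n+1} \in A \mid X_0, \dots, X_n) = P(X_{n+1} \in A \mid X_n)##; neither property implies the other.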
  22. P

    Steady-state vector of infinite Markov chain

    Today we encountered a problem with Markov chains and are wondering if this can be solved analytically. Suppose we have a banded transition probability matrix M of the following form:
    M = [ P P 0 0 0 ...
          Q 0 P 0 0 ...
          0 Q 0 P 0 ...
          0 0 Q 0 P ...
          0 0 0 Q 0 ...
          . . . . . . ]...
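    Reading the band structure as transitions n → n+1 with probability P and n → n−1 with probability Q (assuming P + Q = 1 away from the boundary), the interior balance equations ##\pi_n = P\,\pi_{n-1} + Q\,\pi_{n+1}## for n ≥ 1 are solved by the geometric ansatz ##\pi_n = c\,(P/Q)^n##, which is normalizable, and hence an honest steady state, only when P < Q; the boundary row then has to be checked separately.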
  23. T

    Markov Chain Conditional Expectation

    Hello, in relation to Markov chains, could you please clarify the following equations: In particular, could you please expand on why the first line is equal. Surely from , along with the first equation, this implies that: I just don't see why they are all equal. Please could you...
  24. S

    Stationary distribution of countable Markov chain

    How do I find the stationary distribution of the Markov chain with the countable state space {0, 1, 2, ..., n, ...}, where each point, including 0, can either a. return to 0 with probability 1/2, or b. move to the right n -> n+1 with probability 1/2? Thanks.
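    The balance equations settle this quickly: for n ≥ 1, ##\pi_n = \tfrac{1}{2}\pi_{n-1}##, while ##\pi_0 = \tfrac{1}{2}\sum_{n\ge 0}\pi_n = \tfrac{1}{2}##; together these give ##\pi_n = 2^{-(n+1)}##, which indeed sums to 1.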
  25. E

    How Do Transition Probabilities Determine the Behavior of a Markov Chain?

    Homework Statement Let \left( X_n \right)_{n \geq 0} be a Markov chain on {0,1,...} with transition probabilities given by: p_{01} = 1, p_{i,i+1} + p_{i,i-1} = 1, p_{i,i+1} = \left(\frac{i+1}{i} \right)^2 p_{i,i-1} Show that if X_0 = 0 then the probability that X_n \geq 1 for all n \geq 1 is...
  26. S

    How to Obtain the Transition Probability Matrix in a Birth Death Markov Chain?

    Hi, I am trying to model the behaviour of 2 independent ON-OFF sources. My state diagram is as follows: state 0 = both sources are OFF, state 1 = one of the sources is ON, state 2 = both sources are ON. The transition rates are given as BIRTH RATE = lambda(i) = (N-i)*lambda DEATH RATE =...
  27. E

    Solving Markov Chain Problem for Water Distribution Co. in California

    A water distribution company in southern California gets its water supply from the north and sells it back to its customers in Orange County. Assume the following simplified scheme: 3 MG (millions of gallons) of water arrives from the north at the beginning of the month. The company can store up...
  28. J

    Markov Chain - Find the Stationary Distribution

    Ok, I had a homework problem that I cannot, for the life of me, figure out. I've tried to Google for different sources that would show me how to find the stationary distribution of a Markov chain, but I can't seem to find one that makes sense to me. The transition matrix of a Markov chain is...
  29. R

    [Markov chain] Reading an expected value from the transition matrix

    Hello there, yet another trivial problem: I attended the 'stochastic processes' course some time ago, but the only thing I remember is that this kind of problem is really easy to compute; there is some simple pattern for it, I presume. Thanks for your help, rahl.
  30. S

    Markov Chain of Stochastic Processes

    I would like to construct a model using a Markov chain that has a different stochastic process for each state in the chain. Is there a term for such a thing, or anything similar to it? Thanks
  31. Y

    How do I show that a Markov chain is irreducible?

    How do I show that a Markov chain is irreducible?
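    For a finite state space, one argues that every state can reach every other, either through the transition diagram or numerically. A small numpy sketch of the numerical check, using the standard fact that ##(I + A)^{n-1}## has no zero entries exactly when the chain is irreducible (A being the 0/1 adjacency matrix of allowed transitions):

        import numpy as np

        def is_irreducible(P):
            """Finite chain is irreducible iff every state reaches every
            other, i.e. (I + A)^(n-1) has no zero entry."""
            n = P.shape[0]
            A = (P > 0).astype(float)
            R = np.linalg.matrix_power(np.eye(n) + A, n - 1)
            return bool(np.all(R > 0))

        print(is_irreducible(np.array([[0.0, 1.0],
                                       [1.0, 0.0]])))   # -> True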
  32. N

    Exponential of (Markov Chain) Transition matrix

    Hi, I have a (Markov chain) transition matrix X which I understand. In particular, each row of this matrix sums to 1. I have used this transition matrix to construct its generator, Y. I.e., Y is the continuously compounded transition matrix: X = exp(Y), X*X = exp(2Y), etc. Both X and Y...
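    A hedged scipy sketch of that construction (the example matrix is made up for illustration). One caveat worth knowing: the matrix logarithm of a stochastic matrix always has rows summing to 0, but it is not guaranteed to be a valid generator, since off-diagonal entries can come out negative.

        import numpy as np
        from scipy.linalg import expm, logm

        # Example one-period transition matrix (values assumed for illustration).
        X = np.array([[0.95, 0.05],
                      [0.10, 0.90]])

        Y = logm(X)              # candidate generator, so that X = expm(Y)
        print(np.real(Y))        # rows sum to 0; check off-diagonals are >= 0
        print(expm(2 * Y))       # two-period matrix; matches X @ X up to rounding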
  33. F

    Markov chain and state diagram

    Homework Statement I have the following queuing system: http://img39.imageshack.us/img39/8264/immaginetd.jpg that models voice traffic with parameters \alpha and \beta, on both queue 1 and queue 2. When a voice source is active, it generates traffic with exponential inter-arrival times, which...
  34. H

    Markov Chain Homework: Expressing Transition Matrix P in Terms of G

    Homework Statement Let X0 be a random variable with values in a countable set I. Let Y1, Y2, ... be a sequence of independent random variables, uniformly distributed on [0, 1]. Suppose we are given a function G : I x [0, 1] -> I and define inductively for n >=...
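    The bridge between G and P is that, with ##Y_{n+1}## uniform on [0, 1], ##p_{ij} = P(G(i, Y_{n+1}) = j)## is the Lebesgue measure of ##\{u \in [0,1] : G(i,u) = j\}##; conversely, one recovers a chain with a prescribed P by letting G(i, ·) partition [0, 1] into intervals of lengths ##p_{ij}##.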
  35. A

    Rat and Cat Markov Chain

    Homework Statement A rat and a cat move between rooms 1 and 2 using different paths. Their motions are governed by their respective transition matrices: [0.9, 0.1 ; 0.2, 0.8] and [0.6, 0.4 ; 0.3, 0.7] (a semicolon starts a new row of the matrix, as in MATLAB). If they are ever in the same room...
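    Since the two animals move independently, a convenient tool is the joint chain on pairs of rooms, whose transition matrix is the Kronecker product of the two given matrices; questions about meeting can then be posed on this 4-state chain (for example by making the "same room" pairs absorbing). A minimal numpy sketch:

        import numpy as np

        # Per-animal transition matrices from the post (rows = current room 1 or 2).
        R = np.array([[0.9, 0.1],
                      [0.2, 0.8]])    # rat
        C = np.array([[0.6, 0.4],
                      [0.3, 0.7]])    # cat

        # Joint chain on (rat room, cat room), ordered (1,1), (1,2), (2,1), (2,2).
        J = np.kron(R, C)
        print(J)                      # 4 x 4, each row sums to 1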
  36. S

    Help with determining the transition matrix for a Markov chain

    Homework Statement Well, I have my algebra exam coming up and my teacher told us that there is going to be a Markov chain problem. The only problem I have is that I don't know how to get the initial transition matrix, which is crucial for getting full marks. Can someone help me in determining how...
  37. S

    Solving Markov Chain Problem for Proportions in Areas A, B & C

    Homework Statement I have a scenario in which I have to find the proportion of time a person spends in each area, using Markov chains. I was given a word problem, which I have put into a matrix, and the question asks what proportion of time is spent in each of the areas A, B and C. Homework...
  38. J

    How Do You Calculate the Probability of a Rat Being in Room 4 After Two Moves?

    I've attached the diagram of 4 rooms, which a rat must move through. Each period he changes his room (his state). As you can see if you click on the image, the rat cannot access room 2 from room 3, or vice versa. If I assume the rat begins in room 1, how do I calculate the probability that he will be in...
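    Whatever matrix P comes out of the diagram, the two-move probability is an entry of its square: ##P(X_2 = 4 \mid X_0 = 1) = (P^2)_{14} = \sum_k p_{1k}\,p_{k4}##.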
  39. I

    Markov chain calibration to a set of cumulated frequencies.

    Homework Statement Hi! I have been given the following task: A population of firms can assume three states: good, bad, bankrupt (default). The cumulative frequencies of default (DP) from year 1 to 10 are given. Find an appropriate transition matrix (TM). I'm given a matrix of historical cumulative...
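    One common way to connect the two objects (stated here as an assumption about the intended setup): if default is modeled as an absorbing state, the cumulative default probability at horizon n is ##\mathrm{DP}(n) = (\mathrm{TM}^n)_{i,\,\mathrm{default}}## for initial rating i, so calibration means choosing the one-year TM whose powers reproduce the given 10-year DP profile as closely as possible.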
  40. C

    Markov chain - stationary distribution

    Homework Statement A taxi is located either at the airport or in the city. From the city, the next trip is to the airport with probability 1/4, or to somewhere else in the city with probability 3/4. From the airport, the next trip is always to the city. (a) Find the stationary distribution. (b)...
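    As a check on part (a): with state order (airport, city) the matrix is ##P = \begin{pmatrix} 0 & 1 \\ 1/4 & 3/4 \end{pmatrix}##, and solving ##\pi P = \pi## with ##\pi_A + \pi_C = 1## gives ##\pi_A = \pi_C/4##, hence ##\pi = (1/5,\ 4/5)##.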
  41. ?

    Aperiodicity of a Markov Chain

    Homework Statement The transition matrix is
        0 0 1
        0 0 1
        1/3 2/3 0
    "Argue that this chain is aperiodic." Homework Equations Definition of aperiodicity: there must exist a time n such that there is a non-zero probability of going from state i to state j for all i & j. The...
  42. ?

    Aperiodicity of a Markov chain

    My transition matrix is
        0 0 1
        0 0 1
        1/3 2/3 0
    I'm supposed to argue that this chain is aperiodic. A Markov chain is aperiodic iff there exists a time n such that there is a positive probability of going from state i to state j for all i and j...
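    Before writing the argument, it never hurts to compute the period numerically: the period of state i is the gcd of all n with ##(P^n)_{ii} > 0##. A small numpy sketch (n_max is an assumed scan depth, plenty for a 3-state chain):

        import numpy as np
        from math import gcd
        from functools import reduce

        def period(P, i, n_max=50):
            """Period of state i: gcd of all n <= n_max with (P^n)_{ii} > 0."""
            A = (P > 0).astype(float)
            Q = A.copy()
            hits = []
            for n in range(1, n_max + 1):
                if Q[i, i] > 0:
                    hits.append(n)
                Q = np.clip(Q @ A, 0.0, 1.0)   # keep entries as 0/1 indicators
            return reduce(gcd, hits) if hits else 0

        P = np.array([[0, 0, 1],
                      [0, 0, 1],
                      [1/3, 2/3, 0]])
        print([period(P, i) for i in range(3)])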
  43. P

    Stationary distribution of Markov chain

    Hi all, I'm given a Markov chain Q_k, k>0 with stationary transition probabilities. The state space is uncountable. What I want to show is that the chain is asymptotically stationary, that is, it converges in distribution to some random variable Q. All I have at hand is a k-independent upper...
  44. T

    Markov Chain/ Monte Carlo Simulation

    Let \mathbf{X} be a discrete random variable whose set of possible values is \mathbf{x}_j,\ j \geq 1. Let the probability mass function of \mathbf{X} be given by P\{\mathbf{X} = \mathbf{x}_j\},\ j \geq 1, and suppose we are interested in calculating \theta = E[h(\mathbf{X})] =...
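    For context, a plain Monte Carlo sketch of the quantity being estimated (the distribution and h below are made up for illustration); Markov chain Monte Carlo replaces the independent draws with a chain whose stationary distribution is the target pmf:

        import numpy as np

        rng = np.random.default_rng()

        # Toy discrete distribution and test function, assumed for illustration.
        x = np.array([1.0, 2.0, 5.0])   # possible values x_j
        p = np.array([0.5, 0.3, 0.2])   # pmf P{X = x_j}
        h = lambda v: v ** 2

        N = 100_000
        samples = rng.choice(x, size=N, p=p)
        theta_hat = h(samples).mean()          # Monte Carlo estimate of E[h(X)]
        print(theta_hat, (p * h(x)).sum())     # compare with the exact value 6.7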
  45. C

    Solve Markov Chain Problem: Find Initial Probability Vector p(1)

    [SOLVED] Markov chain problem Hi, all! This is not a homework thing or something like that. I'm just curious. Suppose a Markov chain p(n+1)=Q*p(n), where Q=[q11, q12; q21, q22] is the transition probability matrix and p(n)=[p1(n);p2(n)] is the probability vector. Given Q and p(1), we can...
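    Iterating the recursion gives ##p(n) = Q^{n-1}\,p(1)##; and when Q is invertible the map can also be run backwards, ##p(1) = Q^{-(n-1)}\,p(n)##, although for an arbitrary vector p(n) the result need not have nonnegative entries.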
  46. S

    What is the Power Spectrum of a Markov Chain?

    I'm reading the Wikipedia article on them and I can't really get an understanding of what they are. I'm writing a paper for psychology, and I keep coming across articles that say 'learning can be modeled with Markov chains'. What does that mean?
  47. Z

    Calculating Probability of Markov Chain State in Discrete Time

    Let \left( X_n \right)_{n \ge 0} be a Markov chain (discrete time). I have \mathbf{P} = \left[ p_{ij} \right]_{i,j} = \left[ P\left( X_1 = j \mid X_0 = i \right) \right]_{i,j}, and the initial probability distribution \mathbf{p}^{(0)}. I need to calculate P\left(...
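    The general recipe for such computations: joint probabilities factor along the chain, ##P(X_0 = i_0, X_1 = i_1, \dots, X_n = i_n) = p^{(0)}_{i_0}\, p_{i_0 i_1} \cdots p_{i_{n-1} i_n}##, and the marginal at time n is ##\mathbf{p}^{(n)} = \mathbf{p}^{(0)} \mathbf{P}^n## (with ##\mathbf{p}^{(0)}## as a row vector).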