Stationary probabilities (Markov chain)

In summary, the conversation discusses how to characterize the probability distribution of the number of visits to state 2 after two consecutive visits to state 1 in an irreducible and positive recurrent Markov chain. The formula for this probability is built from the numbers of consecutive failures and successes, as in the paper referenced by one of the posters, and the general expression involves the stationary probabilities of states 1 and 2 together with the transition probabilities between them.
  • #1
MathematicalPhysicist
We are given two states 1 and 2 in an irreducible and positive recurrent Markov chain, with stationary probabilities [tex]\pi_1[/tex] and [tex]\pi_2[/tex] respectively. The task is to characterise, in general, the probability distribution of the number of visits to state 2 after two consecutive visits to state 1.

Any hints?
 
  • #2
Write out the ways this can happen, then turn it into a formula:

1,1,2
2,1,1,2
2,2,1,1,2
1,2,1,1,2
...
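To make the enumeration concrete, here is a small Python sketch that generates exactly these paths (the first occurrence of 1,1 immediately followed by a 2, under that reading of the problem) and evaluates each path's probability. The transition matrix P and stationary distribution pi below are made-up illustrative values, and the initial weighting [tex]\frac{\pi_i}{\pi_1+\pi_2}[/tex] matches the formulas in the reply below.

[code]
# Sketch: enumerate paths whose first occurrence of 1,1 is immediately followed by a 2,
# restricted to the two states {1, 2}. P and pi are hypothetical example values.
from itertools import product

P = {(1, 1): 0.3, (1, 2): 0.7, (2, 1): 0.4, (2, 2): 0.6}  # hypothetical transition probabilities
pi = {1: 4 / 11, 2: 7 / 11}                               # stationary distribution of this P

def paths_first_11_then_2(max_len):
    """Yield state sequences ending in 1,1,2 whose only occurrence of 1,1 is at the end."""
    for length in range(3, max_len + 1):
        for prefix in product((1, 2), repeat=length - 3):
            body = prefix + (1, 1)  # everything before the final 2
            # reject the path if the pair 1,1 already appears earlier in `body`
            if any(body[i] == 1 and body[i + 1] == 1 for i in range(len(body) - 2)):
                continue
            yield body + (2,)

def path_probability(path):
    # start from the stationary law restricted to {1, 2}, then multiply transition probabilities
    p = pi[path[0]] / (pi[1] + pi[2])
    for a, b in zip(path, path[1:]):
        p *= P[(a, b)]
    return p

for path in paths_first_11_then_2(6):
    print(path, path_probability(path))
[/code]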
 
  • #3
Yes, I thought in this direction, but I'm not sure how to get to the formula.
I mean, for 1,1,2 the probability is: [tex]\frac{\pi_1}{\pi_1+\pi_2}P_{1,1}P_{1,2}[/tex]
For 2,1,1,2: [tex]\frac{\pi_2}{\pi_1+\pi_2}P_{2,1}P_{1,1}P_{1,2}[/tex]
For 1,2,1,1,2: [tex]\frac{\pi_1}{\pi_1+\pi_2}P_{1,2}P_{2,1}P_{1,1}P_{1,2}[/tex]

So my hunch is that, depending on whether we start in state 1 or state 2, we should multiply by [tex]\frac{\pi_1}{\pi_1+\pi_2}[/tex] or [tex]\frac{\pi_2}{\pi_1+\pi_2}[/tex] respectively, and we should always multiply by [tex]P_{1,1}[/tex], but beyond that I don't see a general equation covering all cases.
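One way to sanity-check a candidate general formula is to simulate. The sketch below continues the hypothetical two-state example above (made-up transition probabilities) and estimates the distribution of the number of consecutive visits to state 2 immediately after the first two consecutive visits to state 1, which is one reading of the original question.

[code]
# Monte Carlo sketch for the hypothetical two-state chain above: estimate the
# distribution of the number of consecutive visits to state 2 immediately after
# the first occurrence of two consecutive visits to state 1.
import random
from collections import Counter

P = {1: {1: 0.3, 2: 0.7}, 2: {1: 0.4, 2: 0.6}}  # hypothetical transition matrix
pi = {1: 4 / 11, 2: 7 / 11}                     # its stationary distribution

def step(state):
    return 1 if random.random() < P[state][1] else 2

def run_once():
    # start from the stationary law restricted to {1, 2} (here that is just pi itself)
    state = 1 if random.random() < pi[1] / (pi[1] + pi[2]) else 2
    prev = None
    while not (prev == 1 and state == 1):  # walk until two consecutive visits to state 1
        prev, state = state, step(state)
    visits = 0
    state = step(state)
    while state == 2:                      # count the immediate run of visits to state 2
        visits += 1
        state = step(state)
    return visits

counts = Counter(run_once() for _ in range(100_000))
for k in sorted(counts):
    print(k, counts[k] / 100_000)
[/code]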
 
  • #4
I don't have an immediate answer, but you might find the approach in this paper to be helpful:

http://smu.edu/statistics/TechReports/TR211.pdf

The authors derive the unconditional distribution of the number of successes (e.g., state 2) in n+1 trials. It seems to me that you need to derive a similar distribution, conditional on having obtained (exactly? or at least?) two consecutive failures (1,1).
 

Related to Stationary probabilities (Markov chain)

1. What is a Markov chain?

A Markov chain is a mathematical model that describes a sequence of events or states where the probability of transitioning from one state to another depends only on the current state, and not on the previous states.
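For illustration, here is a minimal Python sketch of this property: the next state is drawn from a distribution that depends only on the current state, not on the earlier history. The two-state chain and its probabilities below are made up.

[code]
# Minimal illustration of the Markov property with a hypothetical two-state chain.
import random

transition = {                   # transition[s][t] = P(next state = t | current state = s)
    "A": {"A": 0.9, "B": 0.1},
    "B": {"A": 0.5, "B": 0.5},
}

def next_state(current):
    states, probs = zip(*transition[current].items())
    return random.choices(states, weights=probs)[0]

state = "A"
path = [state]
for _ in range(10):
    state = next_state(state)    # depends only on `state`, not on the earlier path
    path.append(state)
print(path)
[/code]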

2. What are stationary probabilities in a Markov chain?

Stationary probabilities, also known as steady-state probabilities, are the long-term probabilities of being in a certain state in a Markov chain. They represent the equilibrium distribution of states in the chain after many transitions.
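In the notation used in the thread above (with [tex]P_{i,j}[/tex] the probability of moving from state i to state j), a stationary distribution is a row vector [tex]\pi[/tex] satisfying

[tex]\pi P = \pi, \qquad \sum_i \pi_i = 1, \qquad \pi_i \ge 0.[/tex]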

3. How are stationary probabilities calculated?

Stationary probabilities can be calculated by finding a left eigenvector of the transition probability matrix with eigenvalue 1 (equivalently, solving [tex]\pi P = \pi[/tex]) and normalizing it so its entries sum to 1. The normalized eigenvector gives the stationary probability of each state.
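A minimal numpy sketch of this calculation, using a made-up 3×3 transition matrix whose rows sum to 1:

[code]
# Compute a stationary distribution as the left eigenvector of P with eigenvalue 1.
import numpy as np

P = np.array([[0.5, 0.3, 0.2],   # hypothetical transition matrix, rows sum to 1
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Left eigenvectors of P are the right eigenvectors of P.T.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # pick the eigenvalue (numerically) equal to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                       # normalize so the entries sum to 1

print(pi)        # stationary distribution
print(pi @ P)    # should reproduce pi
[/code]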

4. What is the significance of stationary probabilities in a Markov chain?

Stationary probabilities provide important insights into the behavior and stability of a Markov chain. They allow us to understand the long-term trends and patterns of the system, and can be used to make predictions about future states of the chain.

5. Can stationary probabilities change over time in a Markov chain?

No. The stationary probabilities themselves are determined by the transition matrix, so they do not change over time. What changes is the chain's distribution, which approaches them: once the chain is in equilibrium, the probability of being in each state stays the same from one step to the next, regardless of the initial state or the number of further transitions.
