How to simulate a basic Markov chain
Hi,
I'm fairly new to MATLAB. Would anybody be able to show me how to simulate a basic discrete-time Markov chain?
Say, for example, I have a transition matrix with 3 states, A, B, and C. How could I simulate, say, 20 steps starting from state A?
      A    B    C
A   0.3  0.2  0.5
B   0.2  0.1  0.7
C   0.1  0.5  0.4
Any help would be greatly appreciated.
Regards
John
Answers (1)
Doug Hull on 12 Oct 2012 (edited 12 Oct 2012)
Are you looking to do a simple matrix multiply?
v = [1 0 0]
m = [0.3 0.2 0.5; 0.2 0.1 0.7; 0.1 0.5 0.4]
v = v * m
You can also do this in a loop.
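For instance, a minimal loop that advances the distribution one step at a time (equivalent to multiplying by the matrix power) might look like this:

v = [1 0 0];                                        % start in state A
m = [0.3 0.2 0.5; 0.2 0.1 0.7; 0.1 0.5 0.4];
for k = 1:20
    v = v * m;                                      % distribution after k steps
end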
If you want to square the matrix element by element:
m = m.^2
but for a Markov chain you more likely want the true matrix square:
m = m^2
This extends to higher powers (applied to the original transition matrix m):
m = m^20
And putting it together:
>> v = v*m^20
v =
0.1652 0.3217 0.5130
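The code above gives the probability distribution over states after 20 steps. If you instead want to simulate an actual 20-step sample path of the chain, one common sketch (using rand, cumsum, and find to draw each next state from the current row of the transition matrix) is:

% Sample one 20-step path, starting in state A (1 = A, 2 = B, 3 = C)
m = [0.3 0.2 0.5; 0.2 0.1 0.7; 0.1 0.5 0.4];
nSteps = 20;
state = 1;                                          % start in A
path = zeros(1, nSteps + 1);
path(1) = state;
for k = 1:nSteps
    r = rand;                                       % uniform draw in (0,1)
    state = find(r <= cumsum(m(state, :)), 1);      % first bin r falls into
    path(k + 1) = state;
end
labels = 'ABC';
disp(labels(path))                                  % prints a random path such as 'ACCBCAC...'

Each row of m sums to 1, so cumsum of a row gives the cutoffs for an inverse-CDF draw of the next state.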