Chapter III: Markov chains
Administrivia
- Questions from the homework?
Chapter III.1: Defns
Markov process
- almost "memoryless"
- or better: only short-term memory
- P(X_t | X_1, X_2, ..., X_{t-1}) = P(X_t | X_{t-1})
- Much stronger than a martingale, since the probabilities fully describe
  the world (not just the conditional mean); see the sketch below
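A minimal simulation sketch, assuming a hypothetical two-state "weather" chain (the states and numbers are made up, not from the notes): each new state is drawn using only the current state, never the earlier history.

import random

# Hypothetical two-state "weather" chain; the numbers are illustrative only.
P = {"sunny": {"sunny": 0.8, "rainy": 0.2},
     "rainy": {"sunny": 0.5, "rainy": 0.5}}

def step(state):
    # The distribution of the next state depends only on the current state.
    nxt = list(P[state])
    return random.choices(nxt, weights=[P[state][s] for s in nxt])[0]

state, path = "sunny", ["sunny"]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)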
Examples
- the market (even stochastic volatility models are Markov)
- good marketing decisions
- quantum mechanics
- populations (both humans and viruses)
- genetics
- human speech understanding
Transition probability
P_ij = probability of i --> j
- matrix of probabilities
- Along with a starting distribution FULLY describes the world
(Theorem at end of class)
- Just keep multiplying to get the probability of any sequence
Rows sum to one
Starting in i, the chain must go somewhere, so by the law of total
probability the row of transition probabilities out of i sums to one
(sketch below).
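A minimal numeric sketch with a made-up 3-state matrix: check that each row sums to one, then multiply the starting probability by the transition entries along a path to get the probability of that whole sequence.

import numpy as np

# Hypothetical 3-state transition matrix; entries are illustrative only.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])      # P[i, j] = probability of i --> j
mu = np.array([1.0, 0.0, 0.0])       # starting distribution: start in state 0

assert np.allclose(P.sum(axis=1), 1.0)   # each row sums to one

path = [0, 1, 1, 2]                  # a particular sequence of states
prob = mu[path[0]]
for i, j in zip(path, path[1:]):
    prob *= P[i, j]                  # "just keep multiplying"
print(prob)                          # 1.0 * 0.3 * 0.6 * 0.3 = 0.054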
Infinite matrices!?!
- Why? Populations, for example, are most easily modeled as unbounded.
- Difficult to draw--easy to notate
- Natural to consider only a piece of the matrix and end up with
  "sub-probabilities" (rows summing to less than one); see the sketch below.
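One way to see the "sub-probabilities": take a hypothetical chain on the nonnegative integers (a simple random walk that steps up or down by one and holds at 0) and keep only the finite block for states 0 through N; the boundary row loses the mass that escapes past N, so it sums to less than one.

import numpy as np

# Hypothetical random walk on {0, 1, 2, ...}, truncated to states 0..N.
N = 4
Q = np.zeros((N + 1, N + 1))
Q[0, 0] = Q[0, 1] = 0.5              # at 0: stay or move up, each with prob 1/2
for i in range(1, N + 1):
    Q[i, i - 1] = 0.5                # move down with prob 1/2
    if i + 1 <= N:
        Q[i, i + 1] = 0.5            # the jump from N to N+1 falls outside the block

print(Q.sum(axis=1))                 # [1. 1. 1. 1. 0.5] -- last row is a sub-probability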
Theorem: multiplying probabilities
- probability version of the proof (concrete, as in the book)
- the proof technique is called first-step analysis (sketch below)
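A concrete sketch of the first-step-analysis idea on a hypothetical example (gambler's ruin on states 0..N with a fair coin), not the book's proof: conditioning on the first step gives the linear equations h(i) = 0.5*h(i-1) + 0.5*h(i+1) for the probability h(i) of hitting N before 0.

import numpy as np

N = 5
A = np.eye(N + 1)
b = np.zeros(N + 1)
b[N] = 1.0                            # boundary conditions: h(0) = 0, h(N) = 1
for i in range(1, N):
    A[i, i - 1] -= 0.5                # h(i) - 0.5*h(i-1) - 0.5*h(i+1) = 0
    A[i, i + 1] -= 0.5

h = np.linalg.solve(A, b)
print(h)                              # [0, 0.2, 0.4, 0.6, 0.8, 1] = i/N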
Dean P. Foster
Last modified: Thu Jan 22 11:52:10 EST 2009