
Markov Chain

To construct a Markov chain, all we need to know are the probabilities of the states ($M-1$ parameters) and the blockiness of the sequence (the single remaining parameter). For $M=2$, the transition ("telegraph") matrix is built from those parameters as follows.
\begin{displaymath}
P_{T}=\lambda {\pmatrix{1 & 0 \cr
 0 & 1 \cr}}
 + ( 1-\lambda ) {\pmatrix{ \alpha_{1} & \alpha_{2} \cr
 \alpha_{1} & \alpha_{2} \cr}}
\end{displaymath} (1)
where $\lambda$ is the blockiness and $\alpha_{j}$ is the probability of state $j$. The blockiness $\lambda$ is obtained from the exponential coefficient of the well logs' correlation data. The first term on the right-hand side describes the tendency to remain in the same state as before; the second describes the tendency to change into another state. A Markov chain depends only on the last state: once the current state is determined, it forgets all previous states and chooses the next state from the current information alone. If the first state is state 1, the probability state vector is ${\bf {\pi}}=(1,0)$. The probabilities ${\bf {\alpha}}$ for the succeeding state become
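
A minimal sketch in Python (not part of the original report) of how the telegraph matrix of equation (1) might be assembled; the function name telegraph_matrix and the numerical values are illustrative assumptions only.

import numpy as np

def telegraph_matrix(blockiness, alphas):
    # Equation (1): P_T = lambda*I + (1 - lambda)*(matrix whose rows all equal alpha).
    # blockiness = lambda, alphas = (alpha_1, ..., alpha_M); each row of P_T sums to 1.
    alphas = np.asarray(alphas, dtype=float)
    m = len(alphas)
    return blockiness * np.eye(m) + (1.0 - blockiness) * np.tile(alphas, (m, 1))

# Example with assumed values: blockiness 0.8, state probabilities (0.3, 0.7).
P_T = telegraph_matrix(0.8, [0.3, 0.7])
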
\begin{displaymath}
{\pmatrix{\alpha_{1} & \alpha_{2} \cr}}
 = {\pmatrix{\pi_{1} & \pi_{2} \cr}} P_{T}
\end{displaymath} (2)
After obtaining ${\bf {\alpha}}$, a random number (from 0 to 1) is used to choose the new state, and we get the new ${\bf {\pi}}$ for the next lottery.
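
A small continuation of the sketch above (again an assumed illustration, not code from the report): each step takes the row of $P_{T}$ selected by the current state as ${\bf {\alpha}}$, draws a uniform random number in [0, 1), and picks the next state from the cumulative probabilities, which is the lottery described in the text.

def simulate_chain(P_T, first_state, n_steps, rng=None):
    # Generate a layer-state sequence; pi is the unit vector at the current
    # state, so alpha = pi P_T is simply the corresponding row of P_T.
    rng = np.random.default_rng() if rng is None else rng
    states = [first_state]
    for _ in range(n_steps - 1):
        alpha = P_T[states[-1]]                      # equation (2)
        u = rng.random()                             # uniform draw in [0, 1)
        states.append(int(np.searchsorted(np.cumsum(alpha), u)))
    return states

# Example: a 20-sample sequence starting in state 1 (index 0 here).
sequence = simulate_chain(P_T, first_state=0, n_steps=20)
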

