  1. What is the difference between all types of Markov Chains?

    Apr 25, 2017 · A Markov process is basically a stochastic process in which the past history of the process is irrelevant if you know the current system state. In other words, all information about the …
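The property described in this snippet (only the current state matters, not the history) can be sketched in a short simulation; the three-state chain below is a hypothetical example, not from any of the linked answers:

```python
import random

# Hypothetical 3-state transition table: each row lists (next_state, probability).
P = {
    "A": [("A", 0.1), ("B", 0.6), ("C", 0.3)],
    "B": [("A", 0.4), ("B", 0.4), ("C", 0.2)],
    "C": [("A", 0.5), ("B", 0.3), ("C", 0.2)],
}

def step(state, rng):
    # Sample the next state using ONLY the current state -- the Markov property.
    states, probs = zip(*P[state])
    return rng.choices(states, probs)[0]

def simulate(start, n, seed=0):
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))  # depends only on path[-1], not the history
    return path
```

Note that `step` never sees `path[:-1]`; that independence from the past is exactly what makes the process Markov.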

  2. what is the difference between a markov chain and a random walk?

    Jun 17, 2022 · Then it's a Markov chain. If you use another definition: from the first line of the definitions of random walk and Markov chain, I think a Markov chain models a type of random walk, but it doesn't …

  3. Aperiodicity of a Markov chain - Mathematics Stack Exchange

    Jan 1, 2023 · For two states $x,y$ in $E$, let $p^n(x,y)$ denote the $n$-step Markov chain transition probability from $x$ to $y$. Then the period of a state $x$ is the greatest common divisor of all …
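The gcd definition in this snippet can be checked numerically: take powers of the transition matrix and accumulate the gcd of the $n$ for which $p^n(x,x) > 0$. This is a sketch up to a finite power bound, with a hypothetical two-state example:

```python
from math import gcd
import numpy as np

def period(P, x, max_n=50):
    """Period of state x: gcd of all n <= max_n with p^n(x, x) > 0."""
    d = 0
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P                # Pn is now the n-step transition matrix
        if Pn[x, x] > 0:
            d = gcd(d, n)          # gcd(0, n) == n, so the first hit initializes d
    return d

# A chain that deterministically alternates between its two states: period 2.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
# period(P, 0) -> 2
```

Checking powers only up to a bound is a practical shortcut; for a finite chain the gcd stabilizes once enough return times have been seen.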

  4. What is a Markov Chain? - Mathematics Stack Exchange

    Jul 23, 2010 · Markov chains, especially hidden Markov models, are hugely important in computational linguistics. A hidden Markov model is one where we can't directly view the state, but we do have …
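The hidden/observed split that this snippet describes can be sketched as a tiny generator: the state sequence is Markov, but only emissions from the current hidden state are observed. The two-state weather example below is a common illustration, not taken from the linked answer:

```python
import random

# Hypothetical HMM: hidden weather states, observed umbrella usage.
TRANS = {"rainy": [("rainy", 0.7), ("sunny", 0.3)],
         "sunny": [("rainy", 0.4), ("sunny", 0.6)]}
EMIT  = {"rainy": [("umbrella", 0.9), ("no_umbrella", 0.1)],
         "sunny": [("umbrella", 0.2), ("no_umbrella", 0.8)]}

def sample(choices, rng):
    items, probs = zip(*choices)
    return rng.choices(items, probs)[0]

def generate(n, start="rainy", seed=0):
    """Return n observations; the hidden state sequence is discarded."""
    rng = random.Random(seed)
    state, obs = start, []
    for _ in range(n):
        obs.append(sample(EMIT[state], rng))   # observable emission
        state = sample(TRANS[state], rng)      # hidden Markov transition
    return obs
```

Inference algorithms such as Viterbi or forward-backward then try to recover the hidden state sequence from the observations alone.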

  5. reference request - What are some modern books on Markov Chains …

    I would like to know what books people currently like in Markov chains (with syllabus comprising discrete MC, stationary distributions, etc.), that contain many good exercises. Some such book on …

  6. Prove that if $X\to Y\to Z$ is a Markov chain, then $I(X;Z)\le I(X;Y)$

    Almost, but you need "greater than or equal to." We have: $$H(X|Y) = H(X|Y,Z) \leq H(X|Z)$$ where the first equality is from the Markov structure and the final inequality is because conditioning reduces …
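The step from the entropy inequality in this snippet to the stated result $I(X;Z)\le I(X;Y)$ takes one more line; a standard completion of the data-processing argument reads:

```latex
\begin{align*}
I(X;Y) - I(X;Z)
  &= \bigl(H(X) - H(X|Y)\bigr) - \bigl(H(X) - H(X|Z)\bigr) \\
  &= H(X|Z) - H(X|Y) \\
  &= H(X|Z) - H(X|Y,Z) && \text{(Markov: $X \perp Z \mid Y$)} \\
  &\ge 0 && \text{(conditioning reduces entropy)}.
\end{align*}
```

This is the data-processing inequality: no processing of $Y$ into $Z$ can increase the information carried about $X$.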

  7. Why Markov matrices always have 1 as an eigenvalue

    Now, in a Markov chain a steady-state vector is one that is unchanged by the transition matrix (multiplying by it, a linear transformation on the probability state vector, yields the same vector): $qP=q$, where $P$ is the probability transition matrix. This means Y = …
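The eigenvalue claim in this question's title can be verified numerically: a row-stochastic matrix $P$ satisfies $P\mathbf{1}=\mathbf{1}$ (rows sum to 1), so $1$ is an eigenvalue, and the steady-state vector $q$ with $qP=q$ is the corresponding left eigenvector. A sketch with a hypothetical two-state matrix:

```python
import numpy as np

# Hypothetical row-stochastic transition matrix.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Right eigenvector for eigenvalue 1: rows sum to 1, so P @ ones == ones.
ones = np.ones(2)
print(P @ ones)                        # [1. 1.]

# Left eigenvector for eigenvalue 1: eigenvectors of P.T give left
# eigenvectors of P; pick the one whose eigenvalue is (numerically) 1.
eigvals, eigvecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(eigvals - 1.0))
q = np.real(eigvecs[:, i])
q = q / q.sum()                        # normalize to a probability vector
print(q)                               # ~ [0.8333 0.1667], satisfies q @ P == q
```

For this matrix the stationary distribution works out to $q=(5/6,\,1/6)$, and one can check $qP=q$ directly.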

  8. Example of a stochastic process which does not have the Markov …

    Even stochastic processes arising from Newtonian physics don't have the Markov property, because parts of the state (say, microscopic degrees of freedom) tend not to be observed or included in the …

  9. Generalisation of the Markov property to stopping times

    Aug 1, 2023 · So apparently it is a different way of generalising the weak Markov property. Broadly speaking, I would like to know whether this property ($\star$) has a name and under what conditions …

  10. Definition of Markov operator - Mathematics Stack Exchange

    Mar 26, 2021 · Is this a type of Markov operator? (The infinitesimal generator is also an operator on measurable functions). What's the equivalence between these two definitions and what's the intuition …