Talk:Markov property
I suppose that some description of the formulas would be welcome, as is done for the Poisson process.
I believe that the Markov property is only the left-hand side of the first formula. The right-hand side states that this is a first-order Markov chain. Can somebody confirm this?
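For context, the formula being discussed is presumably something along the lines of the standard statement below (a sketch in my own notation, not necessarily the article's symbols; X_t is the state at time t, and the conditioning on the left runs over the whole history up to t):

<math>
\Pr\left( X_{t+h} \in A \mid X_s,\ s \le t \right) = \Pr\left( X_{t+h} \in A \mid X_t \right), \qquad h \ge 0
</math>

In this form it is the equality as a whole, rather than either side alone, that is usually called the Markov property.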
Brownian Motion
The most famous Markov processes are Markov chains, but many other processes, including Brownian motion, are Markovian.
What disqualifies Brownian motion from being a continuous-time Markov chain? Should the above say "are discrete-time Markov chains"? Josh Cherry 02:56, 15 Nov 2004 (UTC)
Upon further reading of continuous-time Markov chain, I suspect that I know the answer. But then isn't the first sentence of continuous-time Markov chain too loose a definition? Josh Cherry 03:03, 15 Nov 2004 (UTC)
- I agree; the 'chain' part is poorly specified. In a Markov chain there is a sequence of states visited by the process (such as the count of a Poisson process), rather than a continuous path (as in [[Brownian motion]]). I'm planning some further edits to that page, so (if no one else does in the meantime) I'll incorporate this too. Ben Cairns 22:52, 27 Jan 2005 (UTC).
redirect
I've redirected this to Markov property because it was worthless. There is an article titled Markov chain that treats discrete-time Markov processes. Markov property does not assume discrete time. This article assumed (incorrectly) not only discrete time but also a state space that was not merely discrete but actually finite. In effect, this denies that the standard Wiener process (and many others) is a Markov process! Michael Hardy 14:51, 23 June 2006 (UTC)
continuous and discrete time
Hmm, this article gives the Markov property only for continuous-time systems; it would be nice if it included the definition for discrete-time systems as well, or at least recapped sufficiently before telling the reader to go read about Markov chains. linas 23:59, 28 August 2006 (UTC)
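For reference, the discrete-time statement that could be recapped is usually written along these lines (again a sketch with my own symbols; X_n denotes the state at step n):

<math>
\Pr\left( X_{n+1} = x \mid X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n \right) = \Pr\left( X_{n+1} = x \mid X_n = x_n \right)
</math>

The continuous-time version conditions on the whole path up to time t rather than on a finite history, which appears to be the only form the article currently gives.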