Talk:Convergence of random variables


WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: B-Class, Mid-priority. Field: Probability and statistics.


Advanced Weak Convergence Results

This article does not include any advanced weak convergence results, i.e., convergence in general metric spaces (for example, Donsker's theorem links to this page, yet there is no mention of the type of convergence actually at work in that theorem). This should be included. --Steffen Grønneberg 17:27, 13 April 2006 (UTC)


plim

Has anyone else seen the "convergence in probability" function being denoted as plim? I recently saw an old textbook as part of a CFA primer course that used this notation, but I have not had success googling for this notation anywhere. Can someone confirm that this is valid notation? If so, can we add it to this article? - DropDeadGorgias (talk) 17:48, Jun 22, 2004 (UTC)


I can confirm this notation. My econometrics text (Kmenta, Jan. Elements of Econometrics, 2nd ed. Ann Arbor: University of Michigan Press, 1987) includes this notation.

- Seconded - see Page 47 of these course notes for example: http://www.econ.bbk.ac.uk/courses/msceconomics/sep/SeptStats05.pdf
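For reference, a sketch of the convention as it is usually stated in the econometrics literature (this is the standard usage as I understand it, not a quote from Kmenta or the course notes): plim denotes the probability limit, so plim X_n = X is shorthand for convergence in probability,

    \operatorname{plim}_{n\to\infty} X_n = X
    \quad\Longleftrightarrow\quad
    \lim_{n\to\infty} \Pr\bigl(|X_n - X| > \varepsilon\bigr) = 0 \ \text{ for every } \varepsilon > 0.

If that matches the cited sources, a one-line remark in the "Convergence in probability" section of the article would cover it.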

Uniform integrability

Good work! The article, though, lacks the further convergence implications that hold when the random variables are uniformly integrable.
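For concreteness, the main implication in question is presumably the following standard fact (sketched here from memory; it should be checked against a reference before going into the article): for integrable random variables,

    X_n \xrightarrow{P} X \ \text{ and } \ \{X_n\} \ \text{uniformly integrable}
    \quad\Longleftrightarrow\quad
    \operatorname{E}|X_n - X| \to 0,

i.e. uniform integrability is exactly what upgrades convergence in probability to convergence in mean.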

Uniform convergence?

What's the view on adding a section on uniform convergence? Or should that be treated under stochastic processes? If so, perhaps a link to the relevant article would be worthwhile?--Steve Kroon 14:14, 14 February 2007 (UTC)


Examples

This article would be greatly improved if some examples were added of sequences of measures that satisfy one convergence concept but not the others.
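Two standard textbook examples (phrased in terms of random variables rather than measures, and sketched here only as candidates):

1. Let X_1, X_2, ... be independent with Pr(X_n = 1) = 1/n and Pr(X_n = 0) = 1 - 1/n. Then X_n converges to 0 in probability, since Pr(|X_n| > ε) ≤ 1/n → 0, but by the second Borel-Cantelli lemma Pr(X_n = 1 infinitely often) = 1, so X_n does not converge to 0 almost surely.

2. Let X be standard normal and set X_n = -X for every n. Each X_n has the same distribution as X, so X_n converges to X in distribution trivially, yet |X_n - X| = 2|X|, so there is no convergence in probability (and hence none of the stronger modes).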

Convergence and the weak law of large numbers

The article says that convergence in distribution is the notion of convergence used in the weak law of large numbers, and that convergence in probability is the notion of convergence used in the weak law of large numbers. One of these statements should be false since the two notions of convergence are different, right?

130.237.43.140 09:39, 28 March 2007 (UTC)

The article says that convergence in distribution is the notion of convergence used in the weak law of large numbers,

I can't find anything in the article saying that. Can you point it out explicitly? Michael Hardy 22:43, 28 March 2007 (UTC)

Sorry for the delay; this is from the article:

Convergence in distribution is the weakest form of convergence, and is sometimes called weak convergence (main article: weak convergence of measures). It does not, in general, imply any other mode of convergence. However, convergence in distribution is implied by all other modes of convergence mentioned in this article, and hence, it is the most common and often the most useful form of convergence of random variables. It is the notion of convergence used in the central limit theorem and the (weak) law of large numbers.

130.237.43.140

I noticed the same thing -- which of the two statements is correct? --Lavaka 17:00, 18 May 2007 (UTC)
Both are correct. The expected value to which the sample mean converges is a constant, which gives us that convergence in probability and convergence in distribution are equivalent in this case. Aastrup 23:29, 24 July 2007 (UTC)
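For anyone reading along, a sketch of the step being used here (the standard argument, written out for clarity rather than quoted from the article): if X_n converges to a constant c in distribution, then for every ε > 0

    \Pr(|X_n - c| > \varepsilon)
    \le F_n(c - \varepsilon) + 1 - F_n(c + \varepsilon/2),

and since the limiting distribution function jumps from 0 to 1 at c, both c - ε and c + ε/2 are continuity points, so the right-hand side tends to 0. Hence convergence in distribution to a constant implies convergence in probability to that constant, and the two wordings in the article are consistent.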

Convergence to random variables

This article seems to take for granted the difference between converging to a function (e.g., sure convergence and almost sure convergence) and converging to a random variable (e.g., the other forms of convergence). Note that a sequence can almost surely converge to a function that is not a random variable (i.e., not a Borel-measurable function). It would be nice if this were cleared up. This would make the definitions make more sense. --TedPavlic 18:47, 8 April 2007 (UTC)

Convergence of Nets and General Stochastic Processes

All of the convergence ideas are in terms of SEQUENCES. That is, they all involve countable index sets. Convergence is defined for uncountable index sets in general. These definitions should be adjusted so that they refer to any general random process, regardless of the countability of its index set. --TedPavlic 18:47, 8 April 2007 (UTC)
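As a sketch of one way the definitions generalize (my phrasing, offered only as an illustration of what the adjusted definitions might look like): for a net (X_α)_{α ∈ A} indexed by a directed set A, convergence in probability to X would read

    \lim_{\alpha \in A} \Pr(|X_\alpha - X| > \varepsilon) = 0 \ \text{ for every } \varepsilon > 0,

where the limit is the net limit along A; almost sure convergence would similarly require X_α(ω) → X(ω) as a net for almost every ω, and convergence in r-th mean would require E|X_α - X|^r → 0 along the net.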

The differences between convergence in probability and convergence in distribution?

Economier 17:24, 11 June 2007 (UTC) I can't understand the differences between them, nor how almost sure convergence differs from either. Could someone explain the differences? I would really appreciate it. FYI, I am a first-semester graduate student majoring in Econometrics. I'm looking forward to hearing from someone who can help.