Talk:Normal distribution

From Wikipedia, the free encyclopedia

Peer review: Normal distribution has had a peer review by Wikipedia editors, which is now archived. It may contain ideas you can use to improve this article.

Please add your comments to the end of the article.


Archives

error in the cdf?

You need to be more specific about what exactly you think might be wrong. --MarkSweep 00:19, 8 September 2005 (UTC)

Integrating the normal density function

Can anyone tell me what the integral of (2π)^(−1/2) e^(−0.5x²) is? I tried integrating by parts and other methods, but no luck. Can someone help?

The antiderivative does not have a closed-form expression. The definite integral can be found:
\int_{-\infty}^\infty e^{-x^2/2}\,dx = \sqrt{2\pi\ }.
See Gaussian function for a derivation of this. Michael Hardy 20:40, 22 May 2006 (UTC)
I didn't find an explicit derivation in the Gaussian function article, so I created this page: Integration_of_the_normal_density_function. Would it be appropriate to place this link somewhere in the normal distribution article? Mark.Howison 06:32, 1 February 2007 (UTC)

Sorry---it's not in Gaussian function; it's in Gaussian integral. Michael Hardy 21:31, 1 February 2007 (UTC)
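The definite integral above is easy to check numerically. A minimal sketch (the midpoint-rule quadrature and the truncation range are my own choices for illustration, not anything from the article):

```python
import math

def gaussian_integral(a=-10.0, b=10.0, n=200000):
    # Midpoint rule on [a, b]; the integrand decays so fast that
    # truncating the tails at +/-10 introduces only negligible error.
    h = (b - a) / n
    return h * sum(math.exp(-0.5 * (a + (i + 0.5) * h) ** 2) for i in range(n))

print(gaussian_integral())       # close to sqrt(2*pi) = 2.5066...
print(math.sqrt(2 * math.pi))
```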

Gaussian curve estimation

I came to this article looking for a way to approximate the Gaussian curve, and couldn't find it on this page, which is a pity. It would be nice to have a paragraph about the different ways to approximate it. One such way (using polynomials on intervals) is described here: [1] I can write it; any suggestion for where to put this? A top-level paragraph before trivia? --Nicolas1981 15:53, 6 October 2006 (UTC)

I think it would fit there. Michael Hardy 19:52, 6 October 2006 (UTC)
I added it. I felt a bit bold because it is very drafty when compared to the rest of the page, but I hope that many people will bring their knowledge and make it an interesting paragraph :-) Nicolas1981 21:37, 6 October 2006 (UTC)

I just noticed that the French article has a good paragraph about a trivial way to approximate it (with steps). There is also this table on wikisource. I have to go out now, but if anyone wants to translate them, please do :-) Nicolas1981 21:54, 20 October 2006 (UTC)
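For reference, one very simple closed-form approximation of the normal CDF is the one-parameter logistic fit with constant 1.702 (a standard textbook device, not the polynomial-on-intervals scheme from the linked PDF), which can be compared against the exact CDF via the error function:

```python
import math

def phi_exact(x):
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi_logistic(x):
    # Classic logistic approximation; maximum error is about 0.01.
    return 1.0 / (1.0 + math.exp(-1.702 * x))

# Worst-case disagreement on a grid over [-6, 6].
worst = max(abs(phi_exact(t / 10.0) - phi_logistic(t / 10.0))
            for t in range(-60, 61))
print(worst)
```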

reference in The Economist

Congratulations, guys - the Economist used Wikipedia as the source for a series of pdf graphs (the normal, power-law, Poisson and one other) in an article on Bayesian logic in the latest edition. Good work! --Cruci 14:58, 8 January 2006 (UTC)

Typesetting conventions

Please notice the difference in (1) sizes of parentheses and (2) the dots at the end:

Z(t)=(d/dt)\log(\int{e^{xt}f(x)dx})=\mu+\sigma^2t+...
Z(t)=(d/dt)\log\left(\int{e^{xt}f(x)dx}\right)=\mu+\sigma^2t+\cdots

Michael Hardy 23:54, 8 January 2006 (UTC)

Quick compliment

I've taught intro statistics, and I've seen treatments in many textbooks. This is head and shoulders above any other treatment! Really well done, guys! Here is the Britannica article just for a point of comparison (2 paragraphs of math with 2 paragraphs of history) jbolden1517Talk 18:54, 5 May 2006 (UTC) <clapping>

Thank you. (Many people worked on this page; I'm just one of those.) Michael Hardy 22:02, 5 May 2006 (UTC)

Eigenfunction of FFT?

I believe the normal distribution is the eigenfunction of the Fourier transform. Is that correct? If so, should it be added? —Ben FrantzDale 16:57, 26 June 2006 (UTC)

That was quick. According to Gaussian function, all Gaussian functions with c² = 2 are, so the standard normal, with σ = 1, is an eigenfunction of the FFT. —Ben FrantzDale 16:57, 26 June 2006 (UTC)
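The eigenfunction property is easy to verify numerically for the continuous (unitary) Fourier transform; the grid, the quadrature, and the evaluation points below are my own choices, not anything from the Gaussian function article:

```python
import numpy as np

# Discretize the standard normal density on a wide grid.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Unitary continuous Fourier transform, evaluated by direct quadrature.
k = np.linspace(-3.0, 3.0, 13)
phi_hat = np.array([np.sum(phi * np.exp(-1j * kk * x)) * dx
                    for kk in k]) / np.sqrt(2 * np.pi)

# phi is (numerically) its own transform: phi_hat(k) == phi(k).
err = np.max(np.abs(phi_hat - np.exp(-k**2 / 2) / np.sqrt(2 * np.pi)))
print(err)
```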

q-function

I'm trying to find out what a q-function is, specifically q-orthogonal polynomials. I searched for q-function and it came here. I'm guessing this is wrong. —The preceding unsigned comment was added by 149.169.52.82 (talkcontribs).

Added archives

I added archives. I tried to organize the content so that any comments from 2006 are still on this page. There was one comment from 2006 that I didn't think was worth keeping. It's in the 2005 archive. If you have any questions about how I did the archive, ask me here or on my talk page. — Chris53516 (Talk) 14:51, 7 November 2006 (UTC)

Can you please link the article to the Czech version

Hello, can you please link the article to the Czech version as follows?

cs : Normální rozdělení

I would do it myself but as I see some characters as question marks in the main article I am afraid that I would damage the article by editing it. Thank you. —Dan

Ok, I did it. Check out how it is done, so you can do it yourself in the future. PAR 10:47, 12 November 2006 (UTC)

Kurtosis Clarity

Is there a way to make clear that the kurtosis is 3 but the excess kurtosis (listed in table) is 0? Some readers may find this confusing, as it isn't explicitly labeled.

Well, it looks clunky, but I changed it. PAR 01:48, 15 November 2006 (UTC)

huh?-summer

what? PAR 00:42, 14 December 2006 (UTC)

Standard normal distribution

In the section "Standardizing normal random variables" it's noted that "The standard normal distribution has been tabulated, and the other normal distributions are simple transformations of the standard one." Perhaps these simple transformations should be discussed? —The preceding unsigned comment was added by 130.88.85.150 (talkcontribs) 11:36, 4 December 2006 (UTC).

They are discussed in the article, just above the sentence that you quote. Michael Hardy 21:58, 4 December 2006 (UTC)
I reworded the section slightly to make that clearer. --Coppertwig 04:35, 5 December 2006 (UTC)
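For readers landing here, the transformation in question is Z = (X − μ)/σ, which maps N(μ, σ²) to N(0, 1). A quick sanity check with the standard library (the values μ = 100, σ = 15, x = 130 are arbitrary examples):

```python
from statistics import NormalDist

mu, sigma = 100.0, 15.0
x = 130.0

# Standardize: Z = (X - mu) / sigma turns N(mu, sigma^2) into N(0, 1).
z = (x - mu) / sigma
p_general = NormalDist(mu, sigma).cdf(x)
p_standard = NormalDist(0.0, 1.0).cdf(z)
print(z, p_general, p_standard)  # the two probabilities agree
```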

Jonnas Mahoney?

Um... This is my first time commenting on anything on Wiki. There seems to be an error in the article, although I'm not certain. Jonnas Mahoney... should really be Johann Carl Friedrich Gauss? Who's Jonnas Mahoney? :S

Edit: lol. fixed. that was absolutely amazing.

-- —The preceding unsigned comment was added by Virux (talkcontribs).

PDF function

I believe there is an error in the pdf function listed, it is missing a -(1/2) in the exponent of the exp!!! —The preceding unsigned comment was added by 24.47.176.251 (talk) 19:02, 11 December 2006 (UTC).

Well, scanning the article I find the first mention of the pdf, and clearly the factor of −1/2 is there, where it belongs:
The probability density function of the normal distribution with mean μ and variance σ² (equivalently, standard deviation σ) is a Gaussian function,
f(x;\mu,\sigma) = \frac{1}{\sigma\sqrt{2\pi}} \, \exp \left( -\frac{(x- \mu)^2}{2\sigma^2} \right) = {1 \over \sigma} \varphi \left( \frac{x - \mu}{\sigma} \right),
where
\varphi(x)=\frac{1}{\sqrt{2\pi\,}} e^{-x^2/2}
is the density function of the "standard" normal distribution, i.e., the normal distribution with μ = 0 and σ = 1.
Similarly, I find the factor of −1/2 in all the other places in the article where the density is given. Did I miss one? If so, please be specific as to where it is found. Michael Hardy 21:26, 11 December 2006 (UTC)
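A quick way to confirm the −1/2 factor is to code the density exactly as printed above and compare it with a library implementation (the test points and parameters are arbitrary):

```python
import math
from statistics import NormalDist

def normal_pdf(x, mu, sigma):
    # Note the -1/2 factor in the exponent, as given in the article.
    return (1.0 / (sigma * math.sqrt(2 * math.pi))) * \
        math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# Cross-check against the standard library's implementation.
for x in (-1.0, 0.0, 0.5, 3.0):
    assert abs(normal_pdf(x, 1.0, 2.0) - NormalDist(1.0, 2.0).pdf(x)) < 1e-12

print(normal_pdf(0.0, 0.0, 1.0))  # 1/sqrt(2*pi) = 0.3989...
```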

Definition of density function

I know I'm probably being somewhat picky, but here goes: In the section "Characterization of the Normal Distribution," we find the sentence:

The most visual is the probability density function (plot at the top), which represents how likely each value of the random variable is.

This statement isn't technically accurate. Since a (real-valued) Gaussian random variable can take on any number on the real line, the probability of any particular number occurring is always zero. Instead, the PDF tells us the probability of the random variable taking on a value inside some region: if we integrate the pdf over the region, we get the probability that the random variable will take on a number in that region. I know that the pdf gives a sort of visual intuition for how likely a particular realization is, so I don't want to just axe the sentence, but maybe we can find a way to be precise about this while avoiding an overly pedantic discussion like the one I've just given? Mateoee 19:46, 12 December 2006 (UTC)

I took a try at it, staying away from calculus. It's still not correct, but it's closer to the truth. PAR 23:50, 12 December 2006 (UTC)
I think I found a way to be precise without getting stuck in details or terminology. What do you think? Mateoee 03:19, 14 December 2006 (UTC)
Well, it's correct, but to a newcomer, I think it's less informative. It's a tough thing to write. PAR 03:41, 14 December 2006 (UTC)

The new version seems a bit vague. But I don't think this article is the right place to explain the nature of PDFs. It should just link to the article about those. Michael Hardy 17:05, 14 December 2006 (UTC)
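Two small computations that make the point without calculus (the values are chosen only for illustration): probabilities come from integrating the density, i.e. from the CDF, and the density at a single point is not a probability at all.

```python
from statistics import NormalDist

# Probability = integral of the pdf over a region, via the CDF.
d = NormalDist(0.0, 1.0)
p_within_one_sigma = d.cdf(1.0) - d.cdf(-1.0)
print(p_within_one_sigma)  # about 0.6827

# The density at a point can even exceed 1 for small sigma,
# which would be impossible for a probability.
print(NormalDist(0.0, 0.1).pdf(0.0))  # about 3.99
```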

Summary too high depth

I linked this article for a friend because they didn't know what a normal distribution was. However, the summary lacked a brief English-language notion of what one is. The summary is confusing for people who haven't had some statistics. If there's not immense negative reaction to altering the summary, I'll do that tomorrow. i kan reed 23:08, 2 January 2007 (UTC)

weird way of generating gaussian

Does anyone know why the following method works?
Generate n uniform random numbers on [0, 1], with n >= 3
Add the results together
Repeat many times
Create a histogram of the sums. The histogram will be a "gaussian" distribution centered at n/2. I put "gaussian" in quotes because clearly the distribution will not go from negative infinity to infinity, but will rather go from 0 to n. It sounds bogus, but it really works! I really wish I knew why, though. --uhvpirate 23:04, 16 January 2007 (UTC)

The article titled central limit theorem treats that phenomenon. Michael Hardy 01:34, 17 January 2007 (UTC)
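A sketch of the experiment described above, assuming the random numbers are uniform on [0, 1] (which the "centered at n/2" and "from 0 to n" claims imply); n = 3 and the sample count are arbitrary choices:

```python
import random
import statistics

random.seed(0)
n = 3
sums = [sum(random.random() for _ in range(n)) for _ in range(100000)]

# The sums concentrate around n/2 with variance n/12 (central limit
# theorem); the support is [0, n], so the normal fit fails in the tails.
print(statistics.mean(sums))      # near 1.5
print(statistics.variance(sums))  # near 0.25 = 3/12
print(min(sums), max(sums))       # inside [0, 3]
```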

lattice distribution

Can someone add a link about the lattice distribution? And, of course, add an article about the lattice distribution. Jackzhp 23:40, 7 February 2007 (UTC)

Open/closed interval notation

In this sentence: "' uniformly distributed on (0, 1], (e.g. the output from a random number generator)" I suspect the user who called this a "typo" and changed it to "[0, 1]" (matching square brackets) didn't understand the notation. "(0, 1]" means an interval that includes 1 but does not include 0. "[0, 1]" includes both 0 and 1. Each of these intervals also includes all the real numbers between 0 and 1. It's a standard mathematical notation. Maybe we need to put a link to a page on mathematical notation? --Coppertwig 13:10, 13 February 2007 (UTC)
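The distinction matters in practice: the Box-Muller transform takes log(u), so u must never be 0. A sketch, assuming a generator like Python's random.random() that returns values in [0, 1) (the sample size is an arbitrary choice):

```python
import math
import random

random.seed(1)

def box_muller_pair():
    # random.random() returns values in [0, 1); using 1 - random.random()
    # shifts the range to (0, 1], so log(u1) is always finite.
    u1 = 1.0 - random.random()
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

samples = [z for _ in range(50000) for z in box_muller_pair()]
print(sum(samples) / len(samples))  # near 0 for standard normal variates
```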

sum-of-uniforms approximation

The sum-of-uniforms approximation scheme for generating normal variates cited in the last section of the article is probably fine for small sets (<10,000), but the statement about it being 12th order is misleading. The moments begin to diverge at the 4th order. Also, note that this scheme samples a distribution with compact support (−6, 6), so it is ill-advised for any application that depends on accurate estimation of the mass of extreme outcomes. JADodson 18:58, 15 February 2007 (UTC)
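For concreteness, the scheme under discussion (sum twelve U(0,1) variates and subtract 6, giving mean 0 and variance 12 × 1/12 = 1) and its compact support can be checked empirically; the seed and sample size are arbitrary:

```python
import random
import statistics

random.seed(2)

def approx_standard_normal():
    # Sum of 12 U(0,1) variates minus 6: mean 0, variance 12 * (1/12) = 1.
    return sum(random.random() for _ in range(12)) - 6.0

samples = [approx_standard_normal() for _ in range(100000)]
print(statistics.mean(samples))      # near 0
print(statistics.stdev(samples))     # near 1
print(max(abs(s) for s in samples))  # never reaches 6: compact support
```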

Complex Gaussian Process

Consider a complex Gaussian random variable,

z=x+y\,i

where x and y are independent real Gaussian variables with equal standard deviations σr = σx = σy. The pdf of the joint variables will be,

\frac{1}{2\,\pi\,\sigma_r^2} e^{-\frac{x^2+y^2}{2 \sigma_r ^2}}

Since \sigma_z=\sqrt{2}\sigma_r, the resulting PDF for the complex Gaussian variable is,

\frac{1}{\pi\,\sigma_z^2} e^{-\frac{|z|^2}{\sigma_z^2}}.
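The relation σz² = E|z|² = 2σr² can be checked by simulation (σr = 1 and the sample size below are arbitrary choices, not anything from the thread):

```python
import random
import statistics

random.seed(3)
sigma_r = 1.0

# z = x + iy with independent real and imaginary parts of equal
# standard deviation sigma_r; then E|z|^2 = 2 * sigma_r^2 = sigma_z^2.
zs = [complex(random.gauss(0.0, sigma_r), random.gauss(0.0, sigma_r))
      for _ in range(200000)]
mean_sq_mod = statistics.mean(abs(z) ** 2 for z in zs)
print(mean_sq_mod)  # close to 2.0
```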