Talk:Statistical significance

I'm not an expert, but this article is mixing two independent concepts in statistics. See: http://hops.wharton.upenn.edu/ideas/pdf/Armstrong/StatisticalSignificance.pdf

Should this be merged or linked to the article p-value?

It has been so helpful. You people are simply statistics.

Shouldn't critical value get its own article?

Comments moved from article

131.130.93.136 put the following at the top of the article:

This article needs revisions regarding both content and clarity. Please use Google to look up the definition of a null hypothesis as well as statistical significance... THIS ARTICLE IS HORRENDOUS.


The following article seems to have an error, as statistical significance is defined the other way round from how it is used here.
The cited significance level of 5% is actually known as the alpha error, error of the first kind, or Type I error, whereas the significance level proper is 95%. Thus, comparing two significance levels of 99% and 95% obviously yields the facts stated below.
Statistical power is defined as 1 − beta, beta being the Type II error, or error of the second kind.
The original article is below:

I find this anonymous user's comments to be without merit. Since it's anonymous, I don't think any further comment is needed. Michael Hardy 01:20, 20 Nov 2004 (UTC)

To my mind, there is a confusion in the article between the Type I error rate (α) OF A TEST, which has to be decided a priori, and the significance level (p-value) OF A RESULT, which is computed a posteriori. I don't regard this confusion as very serious, but some people do. --Henri de Solages 21:57, 10 December 2005 (UTC) I'd like to see this corrected. Smoe
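
To illustrate the a priori / a posteriori distinction, here is a minimal sketch using a one-sample z-test. The test, the data, and every number below are my own made-up example, not anything from the article:

```python
# alpha is a property of the TEST, fixed before the data are seen;
# the p-value is a property of the RESULT, computed from the data.
import numpy as np
from scipy.stats import norm

alpha = 0.05                    # Type I error rate: chosen a priori
mu0, sigma = 100.0, 2.0         # hypothetical null mean and known std. dev.
data = np.array([102.1, 99.8, 105.3, 101.0, 103.7])   # made-up observations

z = (data.mean() - mu0) / (sigma / np.sqrt(len(data)))
p_value = 1 - norm.cdf(z)       # p-value: computed a posteriori

print(f"alpha = {alpha} (chosen before the data), p = {p_value:.4f} (from the data)")
print("reject H0" if p_value < alpha else "fail to reject H0")
```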

Added link from Level to Quantile. Maybe this should move to "See also". Smoe 21:22, 15 January 2006 (UTC)

Some major changes, mostly in tone

I removed the final paragraph only after I had written what is now the second paragraph, and noted that it covered the same ground. I tried to convey the same information somewhat less formally and put it in a more prominent position. I did this because I think this is of great importance to laypersons, who are constantly confused by the concept of significance. This is a serious social issue, since pharmaceutical companies are willfully misleading the public by the use of this term. There are drugs out there, Aricept being one, that have a trivial effect, and no long-term effect on curtailing Alzheimer’s, yet were approved and sold because of their “significant” effect, the degree of which is not even described.

With all due respect for those whose work I am being presumptuous enough to modify, it is those without the benefit of a good college statistics course who need this article. I do not believe I “dumbed it down”, but rather attempted to make it more accessible. I, of course, left untouched the technical description, which is excellent.

I also included the paragraph on spurious significance with multiple groups, which is another way the public can be confused. I will follow up with a reference to the recent women's study, or someone else can do it if they choose.

I would welcome any comments. Arodb 01:11, 26 February 2006 (UTC)

Arodb, I've moved your edits. I think your points should be made, but I felt that the article in its previous form was a masterpiece of succinctness, so I created a section for them in order to restore the clarity of the original. BrendanH 21:04, 20 March 2006 (UTC)

First Sentence Confusing

"In statistics, a result is significant if it is unlikely to have occurred by chance, given that in reality, the independent variable (the test condition being examined) has no effect, or, formally stated, that a presumed null hypothesis is true."

I understand and agree with everything up to the second comma. After the comma, it appears to say that "the independent variable has no effect in reality", which of course depends on the situation... could someone reword it? --Username132 (talk) 03:58, 16 April 2006 (UTC)
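
For what it's worth, I read the intended meaning as the standard conditional statement (my paraphrase, not article text):

```latex
% My paraphrase of the quoted sentence: "significant" means the observed
% result would be unlikely *if* the null hypothesis (the independent
% variable has no effect) were true, i.e. a small conditional probability:
P(\text{result at least as extreme as observed} \mid H_0) \le \alpha
```

That is, the clause after the second comma is not a claim about reality; it states the assumption under which the probability is computed.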

More Confusion

"For example, one may choose a significance level of, say, 5%, and calculate a critical value of a statistic (such as the mean) so that the probability of it exceeding that value, given the truth of the null hypothesis, would be 5%. If the actual, calculated statistic value exceeds the critical value, then it is significant "at the 5% level"."

What is the word "it" in reference to? --Username132 (talk) 04:14, 16 April 2006 (UTC)
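
Presumably "it" refers to the calculated statistic. Here is a minimal sketch of the procedure the quoted passage describes, assuming a one-sided z-test on a sample mean with known standard deviation; the normal model and all numbers are my assumptions, not from the article:

```python
# "Significant at the 5% level" via a critical value for the sample mean.
import numpy as np
from scipy.stats import norm

alpha = 0.05                      # chosen significance level
mu0, sigma, n = 100.0, 15.0, 25   # hypothetical null mean, std. dev., sample size

# Critical value: the mean value exceeded with probability alpha if H0 is true.
critical_mean = mu0 + norm.ppf(1 - alpha) * sigma / np.sqrt(n)

rng = np.random.default_rng(0)
sample = rng.normal(104.0, sigma, n)   # made-up data
observed_mean = sample.mean()          # "it" = the calculated statistic

if observed_mean > critical_mean:
    print(f"{observed_mean:.2f} > {critical_mean:.2f}: significant at the 5% level")
else:
    print(f"{observed_mean:.2f} <= {critical_mean:.2f}: not significant at the 5% level")
```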


Small cleanup

The article seems messy right now. The first paragraph in particular was horrible. I've altered some parts for clarity and to try to make it more concise. Let me know what you think (particularly about the opening paragraph - I'm thinking more should be added to that). --Davril2020 06:21, 31 October 2006 (UTC)

I tried to make the opening paragraph more readable and more accessible to the ordinary person. It can still be further improved. I also added a paragraph to the "pitfalls" section (the last paragraph), describing one more pitfall. --Coppertwig 23:46, 6 November 2006 (UTC)

Popular levels of significance

I changed this, in the opening paragraph, from 10%, 5% and 1% to 5%, 1% and 0.1%. In any of the sciences where I've seen significance levels used, as far as I remember, 5% is the largest level usually considered "statistically significant". Even if some people do sometimes use 10% somewhere, my edit is still not incorrect, since it just lists some examples of commonly used levels. Coppertwig 19:53, 6 November 2006 (UTC)