Wikipedia:Articles for deletion/Least-squares estimation of linear regression coefficients
- The following discussion is an archived debate of the proposed deletion of the article below. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.
The result of the debate was Keep. Deathphoenix 21:24, 27 February 2006 (UTC)
Least-squares estimation of linear regression coefficients
This article is worthless nonsense. Unfortunate, since obviously a lot of work went into it. I've written some comments on the article's discussion page. Michael Hardy 23:42, 5 February 2006 (UTC)
- I corrected a couple of things. You might want to reconsider deleting the article as it provides a proof of the formula used in the regression analysis article. Please tell me what you think. Regards, Deimos 28 13:58, 6 February 2006 (UTC).
- It now says: "In this paragraph we first show that under the Gauss-Markov assumptions, the linear regression problem can be seen as a projection. This will give us a motivation for choosing an optimization criterion from which we will then derive the expression of the least-squares estimator." That looks to me as if it's intended to prove the Gauss-Markov theorem. But the whole thing is so badly written that it will take me a while to find out if that's really the intent. If it is, it's very badly done; the proof of the Gauss-Markov theorem is much simpler than any of what's written on this page. The parts that are clearly comprehensible seem written in a way that will make it incomprehensible to anyone who does not already know this material, and they bring in things that are not relevant to that purpose. Michael Hardy 21:14, 17 February 2006 (UTC)
- No, the aim is not to prove the Gauss-Markov theorem. I actually give a link to the Gauss-Markov theorem article in the conclusion. I really don't know how to state more plainly what this article is about than I have in its introduction:
- (1) give a motivation for the use of least-squares (why choose this particular criterion? What are its advantages?),
- (2) derive the expression of the least-squares estimator.
I have trouble seeing where, in either (1) or (2), you can see me making a proof of the Gauss-Markov theorem. It's true that the Gauss-Markov theorem gives the least-squares estimator one of its great strengths, but apart from that, I don't mention this theorem at all. As for the expression of the least-squares coefficients, it is not necessary to go through the proof of the Gauss-Markov theorem to derive it. If something is wrong with this article, I am more than happy to correct it. You say some parts are "clearly comprehensible": I am more interested in knowing which parts you think are not understandable. The article obviously makes perfect sense to me as I've written it: could you please pinpoint what you think is gibberish? Deimos 28 00:17, 18 February 2006 (UTC)
- The Gauss-Markov theorem is often proposed as an answer to the question "why use least squares?" You stated the assumptions of that theorem. I don't see you "making a proof of the Gauss-Markov theorem", nor a proof of anything else, yet. But when you start with the Gauss-Markov assumptions and say you're trying to motivate this method, that sounds a lot like intent to prove the Gauss-Markov theorem. As for deriving the expression of the least-squares estimator, probability theory is not involved in that at all. It's just linear algebra. Michael Hardy 01:01, 18 February 2006 (UTC)
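A minimal sketch of the purely linear-algebraic derivation Hardy alludes to (the notation $y = X\beta + \varepsilon$ is the conventional one, assumed here rather than taken from the article under discussion): minimizing $\lVert y - X\beta \rVert^2$ over $\beta$ gives

$\nabla_\beta \lVert y - X\beta \rVert^2 = -2 X^\top (y - X\beta) = 0 \quad\Longrightarrow\quad X^\top X \hat\beta = X^\top y,$

the normal equations, so $\hat\beta = (X^\top X)^{-1} X^\top y$ whenever $X^\top X$ is invertible. No probability theory enters at this step.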
- This AfD nomination was orphaned. Listing now. —Crypticbot (operator) 19:48, 19 February 2006 (UTC)
- Keep. Least-squares estimation of linear regression coefficients has 646,000 Google hits; how about you guys fix the article instead of taking it to AfD to make a point? --Ruby 22:10, 19 February 2006 (UTC)
- As I show in the article, the Gauss-Markov theorem is not the only motivation for least squares. As can also be seen in the article, I do not use probability theory to prove the expression of the estimator. However, I do use probability theory to prove that the regression problem is equivalent to an orthogonal projection given the scalar product $\langle X, Y \rangle = \operatorname{E}(XY)$. This is, I think, a fairly natural way to introduce least squares, and it is the main aim of this article. The least-squares estimator is simply an estimate of an orthogonal projection, using the Euclidean scalar product in $\mathbb{R}^n$ instead of the one defined for random variables. The result of the Gauss-Markov theorem is indeed the main asset of this method, but it does not provide a way to construct the estimator. I state the hypotheses and refer to the Gauss-Markov theorem because it provides both a main asset and a main limitation of the least-squares estimator. Deimos 28 00:06, 20 February 2006 (UTC)
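A sketch of the equivalence Deimos describes, assuming the standard $L^2$ inner product on random variables (the notation is conventional, not the article's): with $\langle X, Y \rangle = \operatorname{E}(XY)$ and the norm it induces, minimizing the mean squared error is an orthogonal projection,

$\operatorname*{arg\,min}_{Z \in \operatorname{span}(X_1, \dots, X_p)} \operatorname{E}\big[(Y - Z)^2\big] = \operatorname*{arg\,min}_{Z \in \operatorname{span}(X_1, \dots, X_p)} \lVert Y - Z \rVert^2 = \operatorname{proj}_{\operatorname{span}(X_1, \dots, X_p)} Y,$

and the least-squares estimator then carries out the same projection with the Euclidean scalar product on the $n$ observed data points in $\mathbb{R}^n$.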
- "the regression problem is equivalent to an orthogonal projection given the scalar product " Deimos, that is as clear a statement as I have ever seen you write. If you did the same in the article, I wouldn't complain so much. But I'm not convinced your statement above is true. We're dealing with two different inner products: the one you just mentioned here, and the one in the finite-dimensional space in which the data are found. Also, please see my latest round of comments on the article's discussion page. Michael Hardy 00:58, 20 February 2006 (UTC)
- Keep. Subject is obviously notable. So fix it. Monicasdude 22:22, 19 February 2006 (UTC)
Hi there. I am the author: I took into account Michael Hardy's first suggestions, but since then he has neither removed the deletion tag nor told me what he thinks is still wrong with the article. Could somebody please either tell me what I have to change or improve, or remove the deletion tag? I do not want to remove this tag myself... Thanks in advance, Deimos 28 23:42, 19 February 2006 (UTC)
- Keep. It's a natural breakout from Regression analysis. But this material is in every standard textbook, so there shouldn't be a problem fixing it. Dlyons493 Talk 00:02, 20 February 2006 (UTC)
- Comment to Dlyons and Ruby: So the topic is notable. I am more aware of that fact than you are. What content does the article have, not already in Gauss-Markov theorem and linear regression, that is worth keeping? Michael Hardy 00:28, 20 February 2006 (UTC)
- OK, I know at least as much about the notability of the topic as anyone does. Michael Hardy 00:54, 20 February 2006 (UTC)
-
- Keep. Further merging/redirecting discussions are for talk: pages rather than AfD. -- Jonel | Speak 02:19, 20 February 2006 (UTC)
- Comment: How do you figure that? Some policy I missed? —Wknight94 (talk) 04:06, 20 February 2006 (UTC)
- Sorry, let me expand on that. Keep, as the method is notable and would conceivably be searched for on Wikipedia. I have no opinion on whether it should be merged or redirected or kept as a separate article. So, I am saying that the article should not be deleted. Beyond that, discussions may continue elsewhere. -- Jonel | Speak 06:00, 20 February 2006 (UTC)
- I also wrote most of the regression analysis article. I made this a separate article because I think it is a special case of regression. Although the linear regression article is, I think, a good introduction, it does not provide a proof for the expression of the least-squares estimator and is not very rigorous: the random variables are not defined (indeed, I think the expression "random variable" is not mentioned once in that article), and it restricts itself to the case of first-degree polynomial regression. I think it is fine for people with little mathematical knowledge and that it should stay this way. However, nowhere else that I am aware of (be it in the Gauss-Markov theorem article or in the linear regression article) do we find a proof for the general expression of the least-squares estimator. Also, I think it is very important to mention the geometrical interpretation (i.e., seeing the regression as an orthogonal projection): indeed, it is the same kind of reasoning as is used in Fourier analysis, for example. It shows why we minimize the sum of squares instead of (for example) the sum of the absolute values. I think it is important to distinguish between people who just need a simple formula they can apply without all the theoretical details and those who wish to study the problem in more depth. This article (like some of the regression analysis article) is written for the latter. Deimos 28 10:45, 20 February 2006 (UTC)
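A small numerical sketch of the two points above, the closed-form estimator and its reading as an orthogonal projection (the data, seed, and variable names here are invented for illustration and are not from the article):

    # Hedged sketch with invented data: computes beta_hat = (X'X)^{-1} X'y and
    # checks the projection property (residuals orthogonal to the columns of X).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # design matrix with intercept
    beta_true = np.array([1.0, 2.0, -0.5])
    y = X @ beta_true + rng.normal(scale=0.3, size=n)

    # Least-squares estimator via the normal equations
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

    # Fitted values are the orthogonal projection of y onto col(X),
    # so the residuals are (numerically) orthogonal to every column of X.
    residuals = y - X @ beta_hat
    print(beta_hat)                        # close to beta_true
    print(np.abs(X.T @ residuals).max())   # numerically ~0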
- Definite keep. Regression is one of my major interests, it's definitely notable and not a complete mess. Stifle 11:53, 21 February 2006 (UTC)
- Thanks! Nice to have positive feedback once in a while! Deimos 28 12:28, 21 February 2006 (UTC)
- The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.