Talk:Eigenvalue, eigenvector and eigenspace/Archive
Relation to broader principles
Eigenfunctions could be given more attention (beyond the standing wave problem) somewhere in this page, given their immense importance (e.g., Schrödinger's equation yielding eigenfunctions for discrete energy eigenvalues). But given the quality of this page, it is appropriate to discuss such an addition first.
Fixing up this page
Now that we have all this information in one place we need to figure out the correct structure. --MarSch 3 July 2005 12:20 (UTC)
- I agree, this page is too haphazard and it was hard to find what I was looking for (specifically how to find them). Fresheneesz 19:15, 10 December 2005 (UTC)
When I try to print the main article page, the first instances of 'eigenvector' and 'eigenvalue', in the opening para., do not appear on the hardcopy (neither do the speaker-like symbols that precede these words/links). Is this an error, or am I using an inapt coding to view, please? (Otherwise, it's a great article.) pellis@london.edu 12:11, 28 April 2006
Infinite-dimensional spaces
The concept of eigenvectors can be extended to linear operators acting on infinite-dimensional Hilbert spaces or Banach spaces.
Why is this called an extension? It looks like a straightforward application of the definition. Also, is there really a restriction to Banach spaces in infinite-dimensional cases? In any case, "Hilbert spaces or Banach spaces" is redundant since all Hilbert spaces are Banach spaces. Josh Cherry 3 July 2005 17:17 (UTC)
You're right, the definition of eigenvectors and -values does not depend on the dimension. But the spectrum of an operator is an extension of eigenvalues.
Power method
For the matrix having 2 on the diagonal and 0 off diagonal, the power method as explained in this article will fail to converge to any eigenvector, since applying the matrix is the same as multiplying by two, and this is done an infinite number of times. Am I getting something wrong? Oleg Alexandrov 4 July 2005 00:06 (UTC)
- Uhm, yes, well, I have to admit that "converge" is used in a very loose sense here. I tried to amend the text. It is still work in progress anyway. Somewhere on Wikipedia it should also be mentioned that the method fails with matrices like
- which has two eigenvalues of equal magnitude. Sigh, so much to do ... -- Jitse Niesen (talk) 4 July 2005 00:37 (UTC)
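For concreteness, here is a minimal numpy sketch of the failure mode being described; the matrix below is an illustrative choice with two eigenvalues of equal magnitude (+1 and −1), not necessarily the one originally displayed:

```python
import numpy as np

def power_iteration(A, x0, steps):
    """Repeatedly apply A and renormalize, returning the final direction."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        x = A @ x
        x = x / np.linalg.norm(x)
    return x

# Eigenvalues +1 and -1 have equal magnitude, so the iterates keep
# oscillating between two directions instead of settling on an eigenvector.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
x0 = np.array([1.0, 2.0])
print(power_iteration(A, x0, 50))  # one direction ...
print(power_iteration(A, x0, 51))  # ... and a different one on the next step
```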
The claim that insolvability of the quintic by radicals implies no algorithm to solve polynomials in general is wrong. I have rephrased that section. Stevelinton 09:56, 29 August 2005 (UTC)
Physical significance of eigenvectors
I find the article to be fairly accessible to those who are not math gurus, but too much emphasis in the article is on the abstruse language generally employed by mathematicians to describe what I colloquially term "eigenstuff". What may help is if someone adds some of the physical significances of eigenvalues and eigenvectors, such as:
- The moment of inertia tensor always produces orthogonal eigenvectors, which correspond to principal rotational axes about which the equations of rotational motion become very simple. The eigenvalues are the scale factors of the moment of inertia equation (the principal moments) in such a case; see the numerical sketch after this post.
- A matrix which describes an approximation of a conjugated pi system (the Hückel approximation, if anyone is curious) often appears as a determinant; setting it equal to zero yields the eigenvalues (the allowed energies), and the eigenvectors (normalized but not necessarily mutually orthogonal in whatever vector space they span, not that chemists care about the vector space involved) give the contribution of each atom in the pi system to the molecular orbital associated with the eigenvalue.
- In heat transfer the eigenvectors of a matrix can be used to show the direction of heat flow. I am not entirely sure in this case what the eigenvalues represent. My thermodynamics didn't extend to matrix versions of the Fourier equation, so... (???)
- Someone mentioned stress-strain systems where the eigenvectors represent the direction of either greatest or least strain or stress, depending, IIRC, on the way the matrix is set up. Again, not having done engineering in detail I can't be more specific.
- More generally, someone mentioned to me once that in spectroscopy you can set up a matrix which represents the molecule under study; in infrared spectroscopy in particular, because the molecule effectively acts as a series of coupled springs which vibrate at discrete energies, the eigenvalues give the energies (wavenumbers) of the peaks and the eigenvectors tell what kind of motion is happening. Again, without quantum spectroscopy in detail I can't be more specific than this.
I would like to encourage Wikipedians in general, if they are editing math-specific entries, to try and relate the significance, either mathematically or physically, of the concept or method in less abstruse language than is typical in linear algebra texts. :)
--142.58.101.46 4 July 2005 16:23 (UTC)
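Regarding the moment-of-inertia item above, a small numpy sketch (with a made-up symmetric inertia tensor, arbitrary units) showing that the eigenvectors are indeed orthogonal principal axes and the eigenvalues are the principal moments:

```python
import numpy as np

# Made-up symmetric moment-of-inertia tensor (arbitrary units).
I = np.array([[3.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])

# For a real symmetric matrix, eigh returns real eigenvalues and an
# orthonormal eigenbasis: the principal moments and the principal axes.
moments, axes = np.linalg.eigh(I)
print(moments)         # principal moments, here [2. 4. 5.]
print(axes)            # columns are the principal axes
print(axes.T @ axes)   # ~ identity matrix: the axes are orthonormal
```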
-
- Well, I'm a math person, and I find the majority of what you just said far more "abstruse" than the math jargon. It's all subjective!! 216.167.142.30 20:40, 23 October 2005 (UTC)
- Be bold and start editing. I agree that sometimes mathematicians push abstraction to the point where nobody else understands what is going on. Oleg Alexandrov 5 July 2005 02:36 (UTC)
I think the mathematics and the physics belong in separate articles. The scope of this one is just too large. Perhaps the maths could go in "Spectrum (Linear Algebra)", the physics applications in "Eigenvalues and Eigenvectors in Physics", and other applications in "Applications of eigenspaces outside physics"? From a mathematical point of view, the chain of associations sort of goes Linear Operators (-> Spectra) -> Differential Operators -> Differential Equations (-> Solutions) -> Applications, while in physics it goes Eigenstuff -> {Energy states, Principal axes, Modes, Observables, ...}. I think that combining these into one article will make it hard for physics or mathematics readers (or both) to navigate and stay motivated.
Also, the categories and names in physics and mathematics are quite different. Physicists think of rotations in R^3 and observables in Hilbert space as very different things, but to mathematicians they're all linear operators, though the observables are a special type called Hermitian. R. E. S. Polkinghorne
Eigenfunctions belong here too?
Yes? No? --HappyCamper 06:12, 11 August 2005 (UTC)
Singular Values ?
A reference to singular values and singular value decomposition could be of interest too ? --Raistlin 16:16, 13 August 2005 (UTC)
Table of contents
Oleg Alexandrov's edit summary (full disclosure: complaining about my subsubsubsection):
- I'd rather not have this as a subsubsubsection. That could be correct, but does not look good in the toc
I agree that it looks bad in the TOC, but it seems wrong that we should be compromising the article in order to improve the æsthetics of a standardised TOC. Has this problem been solved satisfactorily elsewhere on WP? (The obvious suggestion would be an option for subsubsubsections (or whatever) not to be mentioned in the TOC at all.) —Blotwell 09:26, 18 August 2005 (UTC)
- There's a (very short) discussion about the possibility of limiting the table of contents to display only up to a certain level of headings at Wikipedia talk:Section#Choosing TOC level, but not much more than wishful thinking at the moment. I agree with User:Blotwell that we shouldn't disrupt structural tags in the article in order to make the table of contents look prettier. —Caesura(t) 18:05, 7 December 2005 (UTC)
Projections
projection onto a line: vectors on the line are eigenvectors with eigenvalue 1 and vectors perpendicular to the line are eigenvectors with eigenvalue 0.
Isn't that true only of orthogonal projections? Consider
Doesn't that qualify as a projection onto a line? Yet it has no eigenvectors perpendicular to the line. Josh Cherry 14:35, 20 August 2005 (UTC)
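The matrix in the post above did not survive, but any oblique (non-orthogonal) projection onto a line makes the same point; here is a hypothetical example in numpy:

```python
import numpy as np

# Oblique projection onto the x-axis, along the direction (1, -1):
# P is idempotent (P @ P == P) but not symmetric, so not orthogonal.
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(P @ P, P))   # True: it is a projection onto the line y = 0

vals, vecs = np.linalg.eig(P)
print(vals)   # eigenvalues 1 and 0
print(vecs)   # eigenvectors (1, 0) and (1, -1)/sqrt(2); the eigenvalue-0
              # eigenvector is along (1, -1), not perpendicular to the line
```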
- You are right, I fixed it.--Patrick 09:16, 21 August 2005 (UTC)
intuitive definition
The definition as given is, I think, unnecessarily hard to visualize. I would say that an eigenvector is any unit vector which is transformed to a multiple of itself under a linear transformation; the corresponding eigenvalue is the value of the multiplier. That is really clear and intuitive. The fact that a basis can be formed of eigenvectors, and that the transformation matrix is diagonal in that basis, is a derived property. Starting off with diagonal matrices and basis vectors is unnecessarily complicated. ObsidianOrder 22:49, 21 August 2005 (UTC)
P.S. That way, eigenvalues and eigenvectors in infinite-dimensional spaces are not an "extension" either; they just follow from the exact same definition. ObsidianOrder
- I agree. Josh Cherry 01:26, 22 August 2005 (UTC)
- Yes.. the idea of forming a basis of eigenvectors does NOT belong in the intro. Here's an idea -- perhaps there should not be an intro? Perhaps the first part of the definition, the non-formal, geometrical part that you describe, can serve as the intro? Or.. if the format that needs to be followed is that there must be an intro, then nevertheless it should go from less technical to more technical, so the intro should have your non-formal geometric definition and in the definition section we can put the formal definition. The whole thing about diagonalizing a matrix should be mentioned, but LATER. Kier07 03:00, 22 August 2005 (UTC)
-
- I think the most intuitive idea of eigenvalues and eigenvectors comes from principal component analysis, and is reflected in the main applications. You want to express the transformation in terms of invariant axes, which correspond to eigenvectors, and how these contract or expand corresponds to the eigenvalues. --JahJah 02:56, 22 August 2005 (UTC)
-
- Perhaps. Here's another idea... would it be possible to have a visual for the intro, instead of this verbal nonsense? We could give a linear transformation from R^2 --> R^2, and show in a picture what this does to a sampling of vectors -- including invariant axes. Kier07 20:35, 22 August 2005 (UTC)
-
-
- At the moment we're giving the mathematical coördinate-free definition in terms of linear transformations, which obscures the connection with many of the applications. Often matrices don't represent transformations, but bilinear forms—this is when they have two covariant/two contravariant indices rather than one of each, if you're into tensors—and we should point out that eigenvectors are physically meaningful in this case too. This is where all the applications that give you symmetric matrices come from. (There is a relation between transformations and bilinear forms, of course, but it wouldn't be helpful to go into it in the introduction.)
-
-
-
- Following on from this, here's the idea I had been thinking about for an introductory visual: if you have a symmetric matrix and apply it to the unit sphere, then you get an ellipsoid whose principal axes have direction corresponding to the eigenvectors and magnitude corresponding to the eigenvalues. (Check: the product of the magnitudes is proportional to the volume, which is right because the product of eigenvalues is the determinant which is the volume scale factor.) This should be easy/fun/illuminating to illustrate. —Blotwell 05:50, 23 August 2005 (UTC)
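A quick numerical check of that determinant remark, with an arbitrary symmetric matrix (any other symmetric choice works the same way):

```python
import numpy as np

# A symmetric matrix maps the unit circle to an ellipse whose principal axes
# point along the eigenvectors, with semi-axis lengths |eigenvalue|.
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals, vecs = np.linalg.eigh(S)
print(vals)                              # semi-axis lengths of the ellipse
print(np.prod(vals), np.linalg.det(S))   # both 5.0: product of eigenvalues
                                         # equals the determinant (area scale factor)
```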
-
motivation?
A little while ago I suggested a sentence in the introduction to give some motivation, something really simple to tell the reader why they should care. It was along the lines of:
- eigenvectors/eigenvalues etc allow us to replace a complicated object (a linear transformation or matrix) with a simpler one (a scalar).
This simple idea, expressed in very plain English, explains why we care about eigenvalues/vectors, without confusing the uninitiated with terminology like bases, diagonal entries, etc. The sentence was deleted at some point, and I feel the omission with some pain. Does anyone else agree that such a sentence would be a Good Thing? Dmharvey Talk 02:30, 23 August 2005 (UTC)
- Me, I agree. Though I'm no pedagogue. —Blotwell 05:52, 23 August 2005 (UTC)
Where eigenvectors come from
In the section on "Identifying eigenvectors", it was not immediately clear where the vector (1, 1, -1) came from. It turned out it was the solution for (A − λI)x = 0, and this turned out to be pretty easy. But it would be nice to explicitly state where to find the "sets of simultaneous linear equations" to be solved.
WillWare 23:14, 25 August 2005 (UTC)
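For anyone else who stumbles here, a short sketch of those simultaneous equations, using a hypothetical matrix (not the one from the article): the equations are just the rows of (A − λI)x = 0, and their solution space is the null space of A − λI.

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical example matrix; lambda = 3 is one of its eigenvalues.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 3.0]])
lam = 3.0

# The "simultaneous linear equations" are the rows of (A - lam*I) x = 0;
# every nonzero vector in the null space is an eigenvector for lam.
vecs = null_space(A - lam * np.eye(3))
print(vecs)                                # columns span the eigenspace
print(np.allclose(A @ vecs, lam * vecs))   # True
```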
Improving further
I have done a lot of work with respect to the intro and definition. I think some of the things I wrote are not optimized and are redundant with information elsewhere on the page. Now I am tired and would like to see other authors contribute to a better structured article. 131.220.68.177 09:31, 8 September 2005 (UTC)
nonsense
Do we want to make any mention here about the fact that many disciplines have started to use "eigenvalue" as a buzz word? I was talking to a philosophy professor recently, a man with numerous published books, and when he began talking about "eigenvalues" I asked him how he was using the term. He was completely taken aback, and finally admitted he had no idea what the term meant, but it was in common use in his field.
- Please do, with examples. The abuse of "parameter" may finally be dying; let's kill this one now. Septentrionalis 02:11, 14 September 2005 (UTC)
-
- You're kidding. Wow. I've never heard of arts-type disciplines using the term "eigenvalue". I'll have to ask the philosophy students I know if their professors do this. I think the overuse of the term may come from the fact that it shows up so much in many sciences (for valid reasons). --24.80.119.229 18:43, 18 September 2005 (UTC)
-
-
- Eigenwert, which is German for eigenvalue, has a broader sense than in mathematical English. It can also be translated as distinct value (see discussion in German at [1]) because eigen is a very common prefix in German and can be used in a very broad sense. In that sense one could say The eigenvalue of the president's speech was more formal than objective. See for example Eigenwert des Ästhetischen at [2]. So sometimes, the term eigenvalue can be used in this sense, for example:
- Thus one can say that research and therefore science fulfills a function and thereby reproduces a stable eigenvalue of modern society. One cannot simply refrain from research without triggering catastrophic consequences -- catastrophe understood here as the reorientation towards other eigenvalues.[3]
- I am no philosopher, but as far as I understand this, it means that science is a value of a modern society which makes it clearly distinct from a more archaic society. Vb 09:58, 19 September 2005 (UTC)
- The paragraph above is a translation into very unidiomatic, and almost unreadable, English; the whole text almost reads like a Babelfish exercise. The use of the <s>claque</s>calque "eigenvalue" in this context is simply an error, as is "reproduces". Eigenwert would have been acceptable, so would "distinctive value". In short, Vb has given a diagnosis, not a justification. Septentrionalis 17:28, 19 September 2005 (UTC)
- I presume that you meant calque. Josh Cherry 00:06, 20 September 2005 (UTC)
- I utterly agree this is a diagnosis, not a justification. One needs more examples of this kind of usage to begin some article/section/disambig on this. Vb 81.209.204.158 16:32, 20 September 2005 (UTC)
-
a duplicate/related article
There's a recently spawned article Symbolic computation of matrix eigenvalues which I don't much like; it seems to be a spinoff of this project, maybe? It needs attention or cleanup or something. linas 01:01, 14 September 2005 (UTC)
- Yes indeed. This info was previously within this article. I put it in a separate article because I thought this info may be interesting for grad students fighting with matrix diagonalization but not for someone interested in the general topic of this article. I think this is the same for eigenvalue algorithm. Maybe a good idea would be to merge Symbolic computation of matrix eigenvalues with eigenvalue algorithm. Vb 10:13, 14 September 2005 (UTC)
Featured article?
I think now the article is getting quite good. However I don't think this is enough for getting featured. I think the following points have to be addressed before submitting for peer review:
- Copy edit: I think the English is not the best. I am not a native speaker and I cannot manage it. That's the reason why I put the copy-edit flag on the page.
- Too technical: Some of the information on the page is too technical. In particular, many properties are stated without explaining to the reader why they are important and in which context they are used. In particular, the sections Decomposition theorem, Other theorems, Conjugate eigenvector, Generalized eigenvalue problem and Eigenvalues of a matrix with entries from a ring should be re-written to be more accessible to a broader audience.
I think much of the info here is good and interesting but sounds a bit too much like a grad book for mathematicians. Vb 16:16, 21 September 2005 (UTC)
Problem with one of the examples
The following passage occurs as an explanatory example:
If one considers the transformation of the rope as time passes, its eigenvectors (or eigenfunctions if one assumes the rope is a continuous medium) are its standing waves well known to musicians, who label them as various notes. The standing waves correspond to particular oscillations of the rope such that the shape of the rope is scaled by a factor (the eigenvalue) as the time evolves.
A string of random length stretched at random tension can produce a sound that corresponds to a particular frequency. But depending on various factors the string can perform a simple vibration in which the entire length of the string moves in the same direction at the same time, or it can produce more complex vibrations (like the one illustrated in the movie). The musical qualities of these various possible variations are emphasized in the reader's mind by the mention of musicians, because a pure sine wave sound vibration is not musically beautiful. Not only will the reader's mind potentially be sidetracked by that line of thought, but it is not exactly true that a musician will label any frequency as a "note" -- particularly if the musician has perfect pitch and the frequency being produced is somewhere midway between an A and an A flat. I don't want to change this passage without being aware of what the original writer was trying to accomplish. P0M 05:37, 23 September 2005 (UTC)
- I am no musician. This example I wrote is a typical physicist's example which has not much to do with music. Here the objective was only to provide an example of an eigenvector in an infinite-dimensional space which could be understood by nonspecialists. If someone is able to make a better link with music, I would be very happy if he could do so. I would learn something! However I think it is important to keep in mind that one should not go into deep details here. This is not the place to present the whole theory of vibrating strings. Vb 07:51, 23 September 2005 (UTC)
If you weren't trying to say something tricky about higher harmonics, Martin guitar vs. Sears guitar, etc., then how about:
If one considers the transformation of the rope as time passes, its eigenvectors (or eigenfunctions if one assumes the rope is a continuous medium) are its standing waves -- the things that, mediated by the surrounding air, humans can experience as the twang of a bow string or the plink of a guitar. The standing waves correspond to particular oscillations of the rope such that the shape of the rope is scaled by a factor (the eigenvalue) as the time evolves.
P0M 15:11, 23 September 2005 (UTC)
- Yes that's true, I had also higher harmonics in mind. However, I believe it is not worth telling more than a line about it, or maybe just a link. What you wrote is from my point of view well done. I would replace "the things that," by "which" but my English is for sure not as good as yours. My problem is also that I am a bit afraid of telling nonsense about music: I have no idea about Martin and Sears guitars. But if you do, why not make a footnote about such details: they are for sure interesting (at least to me) even if they would distract a bit from the main topic of this article. Vb 09:04, 24 September 2005 (UTC)
I made the basic change. If something were to be added maybe we could say something like, "Waves of various degrees of complexity may be present, analogous to anything from the dull plunk of a cigar box string instrument to a chord from a Stradivarius violin"? (Martin guitars are not quite that good, and certainly not that expensive anyway. Sears used to sell cheap guitars that hardly "sang" at all -- sorry, that is definitely POV. ;-) How do the higher harmonics get set up in quantum situations? P0M 01:48, 27 September 2005 (UTC)
- In general, different harmonics are characterized by their number of nodes. The first has no node. In the example of the movie above this is the fourth harmonic. In the hydrogen atom problem, this is similar. For quantum number l=0 (label s), each label n counts the number of nodal surfaces and therefore corresponds to a higher harmonic. For more details look at spherical harmonics. The complexity comes from the fact that the wave is 3D, but I guess this must be a bit like the patterns formed within a spherical wind instrument. Vb 09:30, 30 September 2005 (UTC)
Involutions and projections
I am very sorry but I don't see why all that info about involutions and projections belong here. OK they have eigenvalue -1, 0 or 1. And?... Do you really think it must be put on the same level as Spectral theorem or Applications? I believe not. If nobody provides here any convincing argument, I'll remove this section quite soon. Vb 17:57, 10 October 2005 (UTC)
- ==Linear involutions and projections==
Involutions, that is transformations T such that T² = I, and projections, that is transformations P such that P² = P, if they are linear, have the following simple properties:
- The eigenvalues of an involution can only be 1 or −1.
- The eigenvalues of a projection operator can only be 0 or 1.
Such transformations are ubiquitous in mathematics and in applied disciplines.
Projections exist in pairs of which the sum is the identity, with a corresponding pair of involutions: plus and minus their difference. Conversely, the mean value of an involution and the identity is a projection.
The simplest examples are the identity (T(v) = v), which is an involution as well as a projection, the inversion (T(v) = −v), which is an involution, and the zero operator (T(v) = 0), which is a projection.
If the elements v are functions defined on a set A, then other examples of involutions are those that, for a given subset B of A, negate the function value at B, and keep it fixed at the rest of A. Similarly, examples of projections are those that, for a given subset B of A, make the function value at B zero, and keep it fixed at the rest of A.
In the finite-dimensional case, involutions and projections are represented by diagonalizable matrices, similar to diagonal matrices where, as follows from the above, all the values on the diagonal (the eigenvalues) are 1 and/or −1 for involutions and 1 and/or 0 for projections.
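A small numpy check of the two bullet points above, using a reflection as the involution and the projection built from it by the averaging construction just described (the specific matrices are only an illustration):

```python
import numpy as np

# Reflection across the line y = x: an involution, R @ R == I.
R = np.array([[0.0, 1.0],
              [1.0, 0.0]])
# Mean of the involution and the identity: a projection onto that line.
P = (np.eye(2) + R) / 2

print(np.allclose(R @ R, np.eye(2)), np.allclose(P @ P, P))   # True True
print(np.linalg.eigvals(R))   # +1 and -1, as stated for involutions
print(np.linalg.eigvals(P))   # 1 and 0, as stated for projections
```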
change of title
"Eigenvalue, eigenvector, and eigenspace" has been moved to "Eigenvalue, eigenvector and eigenspace". I don't understand why. I am not a native speaker but I know that BibTeX use this rule for the coma when citing a paper with more than one author and that the editors of Phys. rev. A always use that rule for a list of more than two items. Though this change may be correct I believe it was not necessary. This change has some influence because many links to this page are getting now grandsons of the article instead of sons and this has influence! Could someone tell me whether one should keep the current spelling. If noobdy can I will make the mv back. Vb 09:00, 11 October 2005 (UTC)
- Why don't you ask User:MarSch who did the move (if I read the history correctly)? The second comma in "Eigenvalue, eigenvector, and eigenspace" is a serial comma, and, as that article says, opinion is divided on whether one needs it or not (I am not a native speaker either). -- Jitse Niesen (talk) 12:56, 11 October 2005 (UTC)
-
- Thank you Jitse for your explanation. I understood something: English is not that easy! :-) I won't do anything and will let the native speakers quarrel about that! Vb 09:55, 12 October 2005 (UTC)
Mona Lisa
But what does that make the blue vector?
- the blue vector is not an eigenvector Vb 09:54, 14 October 2005 (UTC)
I think the last line of the text describing the Mona Lisa should read: "They, along with all vectors with the same scale value parallel to the red vector, form the eigenspace for this eigenvalue"--69.118.239.204 03:18, 24 November 2005 (UTC)
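Assuming the picture's transformation is a horizontal shear (which is what it looks like; the exact shear factor is a guess), the situation with the red and blue vectors can be checked numerically:

```python
import numpy as np

# Horizontal shear, illustrative shear factor 0.5.
A = np.array([[1.0, 0.5],
              [0.0, 1.0]])
vals, vecs = np.linalg.eig(A)
print(vals)   # both eigenvalues are 1
print(vecs)   # numerically both columns are (nearly) horizontal: the matrix is
              # defective, and the only eigendirection is the horizontal one
              # (the red vector); a tilted "blue" vector changes direction
              # under the shear and so is not an eigenvector
```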
Separating different parts
Could the mathematical and physical sections be a little more distinct? And also the linear/nonlinear. Many of the things about the linear case don't hold when you move to nonlinear differential operators, and it seems ridiculous to worry about qualifying everything. While technically the linear case is just a "special case" of the nonlinear case, this is a bit like saying analytic functions are a "special case" of R^2 -> R^2 real-differentiable functions, so why waste time on them. Most people will mean the linear case when they think of "eigenvector", and there is so much you can say in just the mathematical context of vector spaces and linear transformations. 216.167.142.30 20:40, 23 October 2005 (UTC)
- I don't understand your point. Many applications of the concept are nonlinear, and the concept of eigenvectors can easily be introduced to lay persons in the most complex context. I believe the choice which has been made is more pedagogic. Many who know what an eigenvector is think it appears in linear algebra only (I was thinking this after my first lecture in algebra): this is untrue, and I think this is one of the reasons why the article was recognized as a featured article. You are speaking about a clearer distinction of physics and math. Physics appears in the examples and the applications only. So why do you bother? Vb 20:51, 23 October 2005 (UTC)
- Well, I have a Ph.D. in math and had never heard of "nonlinear eigenvalues", so don't feel bad. My point concerns when the article attempts to present a formal definition of things. It's one thing to introduce a touchy-feely idea of the concept to a general audience; it's another to try to introduce a formal definition later. My gripe is that the general intuitive concept is introduced as being inclusive of both linear and nonlinear, but then when the formal definition comes around, the linear definition is given, and then later on down, it's mentioned "oh, yes, it can be nonlinear" but no definition of what this might mean is given. So, as it stands, as a mathematician, I have been made aware of the fact that such a thing as "nonlinear eigenvalue problems" exists, but I have no idea what they are or what they mean. Does this make sense? 216.167.142.30 21:02, 23 October 2005 (UTC)
- I'm beginning to wonder if the article isn't becoming too technical altogether. Compare to fractal. At that article, after a great deal of work and compromise, the article settled into a general readable form for non-math or non-science types, but without avoiding or sidestepping important issues relating to math or science. Most of the technical details about theory and applications got shuttled off to other articles (e.g. those defining dimension, relationship to chaos theory and dynamical systems, applications to science, etc, etc). The current article for EEE seems to start off attempting to emulate the fractal article, but then quickly goes into a myriad of technical issues, even algorithmic ones (how to compute complex eigenvalues), and the general big picture seems to be lost. There is more than enough relatively non-technical stuff to talk about to fill out a whole article, without avoiding technical considerations, and there is certainly enough to fill out several technical articles (e.g. the theoretical presentation of EEE in the sole context of vector spaces and linear transformations; the practical and matrix algorithms in the finite-dimensional case; the application to linear differential equations; the nonlinear eigenvalue problem; applications to physics; etc, etc). ALL of these could form separate articles. As it is, each of these has a "stub" here, so the article reads like several stubs following each other. 216.167.142.30 20:58, 23 October 2005 (UTC)
- I suppose my point is, the term "EEE" has probably achieved a certain cultural currency outside of mathematics proper, in the way that "fractal" has achieved a cultural currency outside of mathematics proper. So, in these cases, the main article for such a term (such as "fractal") will have to address the large non-math audience, while other articles address other things. This article will have to address the large non-math audience, but as it is, there is no article set up anywhere for writing in a style for a mathematical audience. At the least, the article starts talking to one audience, quickly starts talking to another, and ends up succeeding with neither. 216.167.142.30 21:06, 23 October 2005 (UTC)
Well, I have a PhD in physics; however, I hate arguments from authority. The point is not whether nonlinear transformations are important but whether one needs linearity to define the concept of eigenvalue. I think not. In fact I believe the picture of the Mona Lisa is enough to explain what it is, and that the definition T(v_l)=l*v_l is general enough without assuming anything about the nature of T. However, what you say is not true: many nonlinear transformations are common and their eigenvalues are important quantities. In the example of the vibrating string, the equations of motion need not be linear. If one considers transformations corresponding to time-evolution operators in dynamical systems, the equations of motion are in general not linear. In the case of quantum mechanics, studying the motion of an open subsystem also leads to nonlinear equations of motion. The study of the eigenvalues of the corresponding operators is a very broad field of research. About the level of mathematical and technical detail: I think many students are looking for exactly this kind of information. Your next criticism is about the stub character of the article. You forget that the article has some daughter pages like eigenvalue algorithm, eigenface, etc... However, if you want to rewrite this article, please be bold! But please have a look at the comments on the FAC page and try to take the remarks expressed there into account. Neither of us is a good judge for deciding what laymen in maths are expecting from this article. The remarks expressed there could provide a kind of guiding thread for this. Vb 16:00, 25 October 2005 (UTC)
- Vb, it's also not quite clear to me what is meant with nonlinear eigenvalues. I suppose that even if T is a nonlinear operator, you can define λ to be an eigenvalue whenever T(v) = λv. But can you explain in more detail why such a definition would be useful? -- Jitse Niesen (talk) 16:46, 25 October 2005 (UTC)
Something missing
The article currently has a group of words that appears to be intended as a sentence:
Sometimes, when \mathcal{T} is a nonlinear transformation, or when it is not possible or difficult to provide a basis set.
All that is there is a "when" clause.
Assuming that this is the desired clause and it's the end of the sentence that is missing, then it would be better to change "or difficult to" to "or is difficult to." P0M 06:21, 26 October 2005 (UTC)
blue arrow in mona lisa
I would prefer if it was horizontal in the left picture and then in the transformed picture it would be parallel to the top and bottom sides still. This would be slightly clearer since it is easier to see what exactly happened. --MarSch 09:51, 26 October 2005 (UTC)
perhaps include normal modes
I think one of the best examples of the use of eigenvectors and eigenvalues is in finding normal modes for complex oscillations. It's a decently intuitive use of linear algebra. What do you think?
- True. Be bold : do it! Vb 09:34, 1 November 2005 (UTC)
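For what it's worth, a minimal sketch of the normal-mode idea: two equal masses tied to walls and to each other by identical springs (all constants set to 1, a made-up toy system). The eigenvectors of the stiffness matrix are the normal modes and the eigenvalues are the squared mode frequencies.

```python
import numpy as np

# Equations of motion x'' = -K x for two unit masses coupled by unit springs;
# normal modes satisfy K v = omega^2 v.
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])
omega_sq, modes = np.linalg.eigh(K)
print(np.sqrt(omega_sq))   # mode frequencies: 1.0 (in phase) and sqrt(3) (out of phase)
print(modes)               # columns ~ (1, 1)/sqrt(2) and (1, -1)/sqrt(2)
```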
notation
The use of bolding and script is non-useful. Vectors are not bolded, except by some physicists sometimes, but even they tire of it. Furthermore the script T for a simple transformation is odd. Notation should be as simple as possible without frills that don't mean anything. What do you think? --MarSch 11:33, 1 November 2005 (UTC)
- As far as I know (I am a physicist) bold vectors v are still standard, even if I prefer the notation with an arrow. Here in this article, the bold notation v has been used for the abstract vectors and the plain notation v for their representation as vertical arrays in a particular basis set. The same has been done for the transformations \mathcal{T} as opposed to their matrix representations T in a particular basis set. I am well aware one can forget about this nuance. I am however persuaded this is not pedagogic. In particular, I think it is important to use distinct notations for the general definition \mathcal{T}(v_λ) = λv_λ and its representation in the linear and finite-dimensional case T(v_λ) = λv_λ. Both objects are really different: \mathcal{T} is a transformation, a function or a functional, and T is a two-dimensional array of scalars. Using a common notation for both would be IMO misleading. Vb 12:43, 1 November 2005 (UTC)
I agree with MarSch.
Crust 21:21, 1 November 2005 (UTC)
- There is no reason to go out of our way and use calligraphic T here. The transformation and the matrix representation are equivalent (in certain basic circumstances), and the noncalligraphic T is only used once, which makes the notational distinction rather useless. I am going to change these back when I have the time. Dysprosia 12:31, 16 April 2006 (UTC)
footnotes
I find the bottom two footnotes should be in the text and not footnotes at all.--MarSch 11:35, 1 November 2005 (UTC)
- There are three footnotes. The first one is a citation which has nothing to do with the article but only with the source of one picture. The second is there to specify which kind of transformation we intend to speak about. This precision is very technical and IMO does not belong in the lead. The third one is about the fact that zero vectors are not considered as eigenvectors. Both of these very technical statements must be made, but not so early in the article. Historically they replaced parentheses, because some reviewers in the FAC pointed out that these comments were disrupting the text flow. Vb 12:58, 1 November 2005 (UTC)
-
- In parentheses they would also be clumsy, but at least they would be in the right place. I think the fact that the transformation is from a vector space to itself is pretty essential to understanding the concept of an eigenvector. A transformation from one space to another would land you in nonsense. The non-zero claim should be explained. Since the zero vector would trivially be an eigenvector of any linear transformation, it is excluded explicitly. The flow is now very bad, since when you see the note, you have to click it to see what it says and then back again, only to discover that the note should really be in the text. --MarSch 13:20, 1 November 2005 (UTC)
-
-
- Well, it breaks the flow. You are right. But the ones who look at these are usually assumed to be experts and therefore able to cope with this inconvenience. But specifying things which are relevant to experts only in the body of the text also breaks the flow. I think the Mona Lisa pictures show which kind of transformations we are speaking about. Moreover, in the nonlinear case zero vectors could also be eigenvectors, even if one does not really need to mention this in this article. -- Vb 14:03, 1 November 2005 (UTC)
-
-
-
-
- Aah, the nonlinear zero eigenvector is convincing. Thanks. I'm sticking in the other though. --MarSch 13:02, 3 November 2005 (UTC)
-
-
german term
This sentence seems odd: "Today the more distinctive term "eigenvalue" is standard in the field, even in German, which used "eigenwert" in the past." If you click on the German version of this article, you discover that it never uses the term "eigenvalue", only "eigenwert". So is the German version out of date, or is the English version wrong in claiming that Germans actually say "eigenvalue"?
- I have changed this. Eigenwert is still the correct German term for eigenvalue. However, some Germans may use the word "eigenvalue", but this is definitely not correct. Vb 12:49, 1 November 2005 (UTC)
- You are both right to some degree. Eigenwert is the correct term in German and this will not change in the future AFAIK. However, the huge majority of serious scientific publications in Germany (above the undergraduate level) are written in English nowadays.
eigenvalues of non-linear transformations?
The article defines eigenvalues for an arbitrary transformation T that is not necessarily linear. Is this a common usage? I have only ever encountered the terms eigenvalue, spectrum, etc. with linear operators. It looks like all the examples in the article refer to linear operators.
Crust 15:18, 1 November 2005 (UTC)
- The point is not whether eigenvalues of nonlinear transformations are important but whether one needs linearity to define the concept of eigenvalue. I think not. However, many nonlinear transformations are common and their eigenvalues are important quantities. In the example of the vibrating string, the equations of motion need not be linear. If one considers transformations corresponding to time-evolution operators in dynamical systems, the equations of motion are in general not linear. In the case of quantum mechanics, studying the motion of an open subsystem also leads to nonlinear equations of motion. The study of the eigenvalues of the corresponding operators is a very broad field of research. Vb 15:35, 1 November 2005 (UTC)
-
- Can you expand a bit on this subject? Take for instance the string. It can indeed be modeled by a nonlinear equation. But what do you mean by eigenvalue in this case and why is it important? It would be very helpful if you could give some references.
- By the way, I don't understand the caption of the picture of the standing wave: "… In this case the eigenvalue is time dependent." Why is this? What is the equation you are using to model the rope? -- Jitse Niesen (talk) 17:44, 1 November 2005 (UTC)
-
-
- Vb, sure the definition of eigenvalue is coherent even if T is non-linear, but that doesn't necessarily mean that it makes sense to allow T non-linear (or more to the point, that at least a non-trivial minority of mathematicians allow this). It seems to me that in the nonlinear case, very little of interest is going to carry over. For instance, let R be the real numbers and consider T:R->R given by T(x) = x^2. Then T has a continuum of eigenvalues, one for each positive real, which seems pathological for an operator on a one-dimensional space.
- By the way, the time translation operator in the string example is linear. We actually have a time translation operator T_s for each real number s with T_s(f) for f=f(x,t) given by T_s(f)(x,t) = f(x,t+s), which is obviously linear in f. The answer to Jitse Niesen's question is that the eigenvalues of the operators T_s do depend on s (which is not at all surprising).
- Crust 19:50, 1 November 2005 (UTC)
- I am very sorry Crust, but I don't understand what you mean. In my opinion, in the string case T_t is nonlinear if T_t(f+g) != T_t(f) + T_t(g). This is usually the case for real ropes. One example: high harmonics are damped much faster than low harmonics, thus after a long enough time T_t(f+g) = T_t(f) if f and g are low and high harmonic signals respectively. This is the reason why when you pick a guitar string you usually produce only the lowest harmonic vibration (because all higher harmonics present in the initial signal are damped very early). Vb 12:26, 2 November 2005 (UTC)
-
-
-
-
-
- Crust, I see what you mean. The eigenvalues of T_s indeed depend on s, but not on t. The operator T_s is linear if one assumes that the string is modelled by the (linear) wave equation ∂²f/∂t² = c² ∂²f/∂x².
- However, more realistic models yield nonlinear equations, and hence a nonlinear time-translation operator. -- Jitse Niesen (talk) 14:07, 2 November 2005 (UTC)
-
-
-
Well Jitse, I have thought a lot about nice examples of nonlinear cases. I think two examples of eigenvectors are the molecular orbitals (eigenvectors of the Fock operator) and resonances as defined by Feshbach and Fano (eigenvectors of the effective Hamiltonian in Feshbach-Fano partitioning). In both cases the nonlinear problem can be reduced to an iterative sequence of linear problems (the SCF algorithm). I think also eigenvectors of nonlinear operators are important properties of nonlinear dynamical systems in the neighbourhood of stability islands in the phase space. However, the main point why I have introduced the concept of eigenvectors in the most general case is not because of those advanced topics but because I didn't want to explain to the reader what a linear transformation or a basis set is before explaining the basic concept of eigenvector. The usual definition of eigenvalues and eigenvectors comes as an advanced topic in linear algebra, after defining all the concepts like basis sets, matrices, etc... Coming from such a definition, eigenfunctions would have appeared as a generalization and not as a direct application of the concept. Vb 09:11, 2 November 2005 (UTC)
- I am not familiar with either the Fock operator or the Feshbach-Fano partitioning. However, as far as I can see from the Wikipedia articles on this topic, all operators are still linear. I think most of the foundations of quantum mechanics would break down if the Hamiltonian were not a linear (and even self-adjoint) operator. Are you talking about the Hamiltonian being nonlinear, or some other operator? -- Jitse Niesen (talk) 14:07, 2 November 2005 (UTC)
- First, both operators are not fundamental to quantum mechanics. The Fock operator is used as a first approximation introduced to produce a basis set (the Slater determinants made of molecular orbitals). The Feshbach-Fano Hamiltonians are introduced to deal with quantum subsystems. The Fock operator depends explicitly on the function it is applied to: F(φ) is not simply a matrix multiplication. In the Feshbach-Fano partitioning case, H_eff depends explicitly on the eigenvalue E, and the equation H_eff(E)Ψ_E = EΨ_E is a nonlinear equation. Vb 14:46, 2 November 2005 (UTC)
- I will assume that you're right for the Fock operator. However, in the equation H_eff(E)Ψ_E = EΨ_E, something else is happening, in that the operator depends on the eigenvalue. Such equations are indeed studied, the simplest one being the so-called quadratic eigenvalue problem λ²Av + λBv + Cv = 0, where A, B, and C are matrices. But these equations are not of the form T(v) = λv. In dynamical systems, one looks at eigenvalues of the linearized operator (at least, in all cases I'm familiar with). -- Jitse Niesen (talk) 20:51, 2 November 2005 (UTC)
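For readers who wonder how such quadratic problems are handled in practice, one standard trick is companion linearization: introduce u = λv and solve an ordinary eigenproblem of twice the size. A sketch with made-up matrices (A assumed invertible):

```python
import numpy as np

# Quadratic eigenvalue problem (lam^2 A + lam B + C) v = 0 with toy matrices.
A = np.eye(2)
B = np.array([[0.0, 1.0], [1.0, 0.0]])
C = np.array([[2.0, 0.0], [0.0, 3.0]])

# Companion linearization: with u = lam*v, the pair (v, u) is an ordinary
# eigenvector of the 4x4 block matrix below (valid when A is invertible).
Ainv = np.linalg.inv(A)
n = A.shape[0]
companion = np.block([[np.zeros((n, n)), np.eye(n)],
                      [-Ainv @ C,        -Ainv @ B]])
lams, W = np.linalg.eig(companion)
lam, v = lams[0], W[:n, 0]          # top half of a companion eigenvector
print(np.allclose((lam**2 * A + lam * B + C) @ v, 0))   # True
```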
- I can see your pedagogical point about not wanting to explain the concept of "linear transformation" (by the way, I don't see why you need to talk about "basis sets"). But that would also argue against mentioning "nonlinear". I don't understand what you mean with "Coming from such a definition, eigenfunctions would have appeared as a generalization and not like a direct application of the concept." If you come from the definition: an eigenvalue is a λ for which there is a nonzero v such that Tv = λv, then eigenfunctions are a direct application, aren't they? -- Jitse Niesen (talk) 14:07, 2 November 2005 (UTC)
- When I was preparing the Mona Lisa picture I tried different transformations available in my graphics program (xv). I found out that most transformations available were nonlinear, so I think that nonlinear transformations are something pupils can meet simply by trying out special graphic effects. Of course we can omit nonlinear in the list of examples of transformations, but I don't know whether this helps anybody. Of course you are right when you say Tv = λv can be used as well for functions, if T is not defined as a matrix, as is usually the case in standard introductions to the topic. Vb 14:46, 2 November 2005 (UTC)
- Omitting nonlinear would mean that Wikipedia does not contradict all the standard texts; it seems obvious to me that that is a good thing. Of course, readers can try out nonlinear transformations, but I still think that "eigenvectors" for nonlinear transformation are by far less useful than in the linear case, and that even calling them eigenvectors is an abuse of language. -- Jitse Niesen (talk) 20:51, 2 November 2005 (UTC)
Vb, in your guitar string example I think T_t(g) is very close to zero for large t. So the fact that T_t(f+g) is very close to T_t(f) for large t is a consequence of linearity, not a counterexample. I don't know much physics, so I certainly can't speak with authority, but it looks like the Fock operator is also linear (the fact that Fock operator is a redirect to Fock matrix would certainly seem to suggest this). Crust 22:04, 2 November 2005 (UTC)
Does anyone (other than Vb) think it is appropriate to allow non-linear functions in the definition of eigenvalue? I think that Jitse Niesen's comment above is exactly right; I have never seen this elsewhere. Can anyone find a source that defines eigenvalue this way? As an aside, let me say thanks to Vb for putting a lot of work into this article. Crust 15:45, 3 November 2005 (UTC)
I thought about that and I think Crust and Jitse are right. Let's get rid of nonlinear, but I also think it would be a bad idea to limit this concept to linear transformations from the beginning because, in principle, we don't need the concept of linearity to define eigenvalues. Vb 12:45, 4 November 2005 (UTC)
Now that even Vb semi-agrees, I've gone ahead and cleaned up the article on this point. Crust 16:04, 4 November 2005 (UTC)
- No, I don't agree with Crust's change. Though I really understand that eigenvectors of nonlinear transformations don't need to be discussed here, there is also no reason to introduce the concept of linearity before it is necessary to do so. Most basic concepts related to eigenvalues and eigenvectors can be understood without understanding what a linear transformation is. Vb 16:48, 4 November 2005 (UTC)
Vb, the point isn't that "eigenvectors of nonlinear transformations" don't need to be discussed; the point is that that is an oxymoron. I agree that for a non-mathematical reader linear transformation is a little more intimidating than transformation (which is itself probably already intimidating), but I think the first priority needs to be to write an accurate article. Readability and accessibility to a general audience are important, but we have no business writing things that are just plain wrong. An alternate way to phrase it might be to say "In linear algebra, a branch of mathematics, ..." and then use the word transformation with the link actually pointing to "linear transformation". I'm not sure which is the less intimidating way to do it.
Crust 18:33, 4 November 2005 (UTC)
OK, I've tried another wording in an attempt to please Vb (= 131.220.68.177). It occurs to me that readers who find linear transformation intimidating will probably also be intimidated by vector space, so I've avoided both. Crust 19:37, 4 November 2005 (UTC)
- As you've seen, I reverted your last edits. I don't think linear transformation is intimidating. I simply think it is useless for defining what an eigenvector is. The article is not a subarticle of linear algebra. If you look at the history of this article you will see that the original featured article was referring to vector space in the lead through a footnote only. The original idea of this article was (see discussion above) to merge eigenvectors, eigenfunctions, eigenstates and other things into one single article. My idea was to merge all those things under the abstract concept of eigenvector by referring to general transformations only, because I think this abstract concept is in fact easier to explain than the concept of matrix diagonalization. Well, all in all this article is not mine, and if you really want to change it utterly from its featured version, I will not begin an edit war and will, well, let the article die. That's it: Wikipedia is a living organism; its articles may live but also die. Vb 08:42, 7 November 2005 (UTC)
Vb, please note that eigenfunction and eigenstate are just special words for eigenvector in specific contexts (an "eigenfunction" is an eigenvector of a linear transformation on a function space, and "eigenstates" are eigenvectors of certain linear operators in quantum mechanics). In all cases, the operators involved are linear. When we talk about the continuous spectrum, again we are talking about a linear operator. E.g. d/dx is a linear transformation that acts on the vector space of infinitely differentiable functions. The eigenvectors e^{kx} are also called eigenfunctions since they are functions, and the spectrum of d/dx is continuous (it is the whole real line). There is nothing non-linear about this situation (perhaps you are somehow confusing the terms "non-linear" and "infinite-dimensional"?). Vb, you seem to have your own personal definition of eigenvalue, spectrum, etc. that includes non-linear operators. If I (and Jitse) are wrong and there is some non-trivial community that uses these words the same way you do, OK, but please provide references (and even if that is the case, we will need to note that most authors do not do this). It would be pretty crazy to label a math article (and a featured article to boot) with a Wikipedia:Accuracy dispute tag, but if we can't come to agreement on this I don't know what else to do. Crust 22:05, 7 November 2005 (UTC)
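The e^{kx} claim is easy to verify symbolically (a one-liner, assuming sympy is available):

```python
import sympy as sp

x, k = sp.symbols('x k')
f = sp.exp(k * x)

# d/dx is linear, and exp(k*x) is an eigenfunction with eigenvalue k:
print(sp.simplify(sp.diff(f, x) - k * f))   # prints 0
```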
- It seems that Vb agrees with not mentioning nonlinear operators, as s/he wrote in an edit comment today "I really agree with not mentioning nonlinear". Vb does not want (and neither do I or you) to restrict to finite dimensions. -- Jitse Niesen (talk) 22:44, 7 November 2005 (UTC)
-
- I hope you're right that Vb now agrees with restricting to the linear case. I fixed the definition in the first sentence of the article to reflect this. Crust 23:06, 7 November 2005 (UTC)
Well, I think I have an argument which could make you understand my physicist's point of view. When you look at the spectrum of an atom, or more simply a vibrating rope or a musical instrument, you observe peaks in the cross section, an oscillating shape scaled by a factor as the time evolves, production of particular sound frequencies (Helmholtz's Eigentöne). You don't mind whether the response function of your atom, your rope or your musical instrument is linear. You simply observe that -- and this is particularly clear in the case of the rope -- the produced signal is simply scaled by the time evolution operator and is therefore an eigenfunction of it. This operator is in general nonlinear. Of course it can usually be linearized in the neighbourhood of its eigenfrequencies, but I think this is very technical. I don't claim that nonlinear aspects should be discussed in this article; I just claim that linearity has nothing to do with the definition of eigenvalue, eigenvector, eigenspace, and geometrical multiplicity. Vb 10:14, 8 November 2005 (UTC)
Please provide a reference that defines eigenvalues for non-linear operators. If you can find such a reference (preferably by a link to a website, otherwise by a page reference to a standard physics text, e.g. Cohen-Tannoudji), then we can note both definitions, i.e. that some authors allow T to be non-linear, but most do not. Otherwise, we must stick to the standard definition. Sure, you would personally like to allow T to be non-linear. But what would you say to someone else who might want to allow λ to be an operator instead of just a scalar, etc., etc.? This is supposed to be an encyclopedia not a personal webpage. Crust 17:16, 8 November 2005 (UTC)
- I had not yet looked seriously for nonlinear eigenvalue problems because I think they don't belong in such an introductory article. My point was just a point about the hierarchy of concepts. The concept of eigenvalue is in my opinion not dependent on the concept of linearity and therefore doesn't need to refer to it. Of course Cohen-Tannoudji's book does not speak about it, because such theories are far more advanced. However, in order to answer your request, I searched with Google for nonlinear operator eigenvalue and found many links, including http://etd.caltech.edu/etd/available/etd-09252002-110658/ or http://arxiv.org/abs/physics/9611006. Well, I agree those references are not authoritative, but I really think that is not the point. I don't want to make a research project out of this. I just want to point out that linearity is a concept which has nothing to do with eigenvalues and that both concepts should not be mixed. If you are interested in this, try some Google searches as I did. You will be astonished how many references you'll find. Vb 10:14, 9 November 2005 (UTC)
-
- The concepts of eigenvalues, eigenvectors and eigenspaces are really concepts from linear algebra. The related articles in all other languages (that I can read) mention linear algebra right at the beginning, and these [4][5][6][7] don't mind mentioning linear transformations explicitly. Hilbert's title [8] makes it clear that his usage is in a linear context. Hamiltonian (quantum mechanics) makes it clear that "H is a Hermitian operator." The sinusoidal standing waves in the rope example are eigenfunctions of the time differential operator, which is linear. Vb seems to be confusing this point - the "response functions" of the oscillator may be nonlinear, but the eigen- terms arise in this context specifically because the differential operator is linear. While there are undoubtedly uses of the term eigenvalue applied to nonlinear operators in the literature, I would guess they are mostly novel extensions of the concept. To exclude the mention of linear algebra and linear transformations in this article based on a dedication to these generalizations is hubris and/or original research. I also think it is silly to exclude linear algebra and linear transformation based on hypothetical readers being unfamiliar with the terms, and yet to include vector space, which is clearly defined as "the basic object of study in the branch of mathematics called linear algebra."Zander 13:34, 9 November 2005 (UTC)
-
-
- Vb, I had a quick look at the two references you cited. The first (a PhD thesis from 1968) does define the terms eigenvector and eigenvalue for an arbitrary (not necessarily linear) operator on a Banach space (although it restricts the term "spectrum" to linear operators). For the second, I think you again made the mistake discussed by Zander above of confusing what is nonlinear. For that paper, the eigenvalues are eigenvalues of the (linear) Hamiltonian. But all this isn't really the point. Finding a paper or two that (in my view anyway) abuses or extends conventional terminology for the author's convenience is not sufficient. If this really is a standard usage, it should appear in some standard reference. (Sure, Cohen-Tannoudji is an undergraduate text. I just mentioned it because you cited it as a reference and also I am/was familiar with it. If you find some advanced graduate text that defines eigenvalue this way that's obviously fine.) Crust 14:55, 9 November 2005 (UTC)
-
OK. 3 vs 1, I guess I am in the minority. So be it, if everybody finds it silly to introduce only the necessary concepts before stating a definition. However, I would like you to try Google with "nonlinear eigenvalue problem", "nonlinear eigenvector", "nonlinear operator eigenvalue", etc. It is very surprising to discover how many authors are interested in this question. I have even found a workshop which looks very serious, with the title "Mini-Workshop on Nonlinear Spectral and Eigenvalue Theory with Applications to the p-Laplacian", and they mention that one of the topics will be "During the Mini-Workshop we will discuss recent progress and open problems in the theory, methods, and applications of spectra and eigenvalues of nonlinear operators." This sentence is also interesting: "Although eigenvalue and spectrum analysis for nonlinear operators have been studied by many researchers in mathematics literature, singular value ..." (Kenji Fujimoto, Nagoya University). "A new spectral theory for nonlinear operators and its applications" (W. Feng, Abstr. Appl. Anal. 2, no. 1-2 (1997), 163-183). "We study the asymptotic behaviour of the eigenvalues of a family of non-linear monotone elliptic operators of the form A_\varepsilon u = -\mathrm{div}(a_\varepsilon(x, \nabla u)) with oscillating coefficients", from "HOMOGENIZATION OF A CLASS OF NON-LINEAR EIGENVALUE PROBLEMS", presented at the "VII French-Latin American Congress on Applied Mathematics" by Mahadevan RAJESH and Carlos Conca. I could go on like that for a whole page. Of course I agree with Zander and Crust that the concept of eigenvalue comes from Hilbert's work on linear algebra. I also agree that students are taught about it during the course on linear algebra. But I have to oppose on one point: you don't need linearity to define eigenvectors. You need it when you talk about the spectral theorem, but not to define the basic concepts. Vb 15:09, 9 November 2005 (UTC)
- Once again: I have googled a bit for "nonlinear Schroedinger equation" and "nonlinear eigenstate". Those things exist, in the context of Bose-Einstein condensates. Look at this http://aleph.physik.uni-kl.de/~korsch/preprints/Graefe_quantph_0507185.pdf and at Nonlinear Schrödinger Equation at EqWorld: The World of Mathematical Equations. Vb 17:22, 9 November 2005 (UTC)
-
- Your most recent two physics examples appear off the mark. The BEC paper is the same issue as the Fock matrix discussion above. The Nonlinear Schrodinger Equation site doesn't even use any word starting with "eigen-"! However, your previous applied math examples look legit. Perhaps we could include this generalization as a new section/subsection (similar to entries from a ring and the generalized eigenvalue problem, neither of which is really specific to matrices, by the way). While we're at it, there are several other, I think more common, generalizations that we perhaps should mention (other types of generalized eigenvalue problems; pseudo-eigenvalues; generalized eigenvectors, e.g. for eigenvalue 0, vectors that are annihilated by some power of an operator but not by the operator itself). Crust 18:07, 9 November 2005 (UTC)
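- For concreteness, one standard instance of that last generalization, with the matrix chosen purely for illustration:

N = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad N \begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} \neq 0, \qquad N^2 \begin{pmatrix} 0 \\ 1 \end{pmatrix} = 0,

so (0, 1)^T is a generalized eigenvector of N for the eigenvalue 0, but not an ordinary eigenvector.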
-
-
- Well, I don't think those two physics examples are off the mark. But from googling around I noticed that nonlinear eigenvalue problems are in fact so common that they could require a section of their own. However, I really don't feel competent to write it. Mathematicians who specialize in the topic would do much better than I would. Vb 09:51, 10 November 2005 (UTC)
-
-
-
-
- Vb, you've got a great sense of humor. You cite an article that doesn't even contain the string "eigen" and, when challenged, continue to insist that it involves eigenvalues of non-linear operators. Hilarious. The BEC situation is slightly more subtle in that it uses the phrase "nonlinear eigenstate", but if you actually look at the paper, the "nonlinear eigenstates" are eigenvectors of linear operators; it's just a generalized eigenvalue problem (similar to the situation discussed by Jitse with reference to the Fock operator). I think it's pretty clear that the applied math literature on this is, as discussed by Zander, very much a novel generalization of the concept by a relatively small community. One symptom of this is that they don't have a standard definition of "spectrum"; there are many competing definitions that give different answers for the same operator.
- I'm going to change "transformation" to "linear transformation" one last time. Feel free to research and start a generalization section covering non-linear operators (though, like I said, there are several other, probably more important, generalizations we don't currently cover). If you're still unhappy, I don't see any other solution than a disputed accuracy tag. Let me close by saying that you have put a lot of work into this article, and I think it is therefore reasonable to show some deference to you on matters of taste, but not accuracy. This will likely be my last substantive post on this; I just don't have the bandwidth to keep chasing these things down. Crust 18:56, 10 November 2005 (UTC)
-
-
Well. We don't agree. I still believe we simply disagree on the form and not on the content. I don't want to discuss nonlinear transformations either. If you want to explain to a friend of yours in five minutes what an eigenvector or an eigenvalue is, would you begin by explaining what a linear transformation is? I don't think so: you would explain what a transformation is and that eigenvectors are vectors which are just scaled during the transformation. I also have other things to do. Vb 13:28, 11 November 2005 (UTC)
Who is the audience for this article?
This is yet another example of an article on a mathematical topic written by mathematicians for mathematicians.
Why not write in such a way that readers with the necessary pre-requisite knowledge can be led into the topic in a natural and friendly way?
The professional mathematicians have their own forums - typical Wiki readers come here to learn, not to be confronted with a highly technical approach which can be appreciated only by those who know it all anyway.
203.0.223.244 23:05, 1 November 2005 (UTC)
-
- In my opinion, this article provides just the sort of introduction to the general reader that the above comment says it fails to provide. I am not a mathematician, but I am interested in math and science and read non-professional (i.e., popular) literature in these fields. I have seen eigenvalue and eigenvector in my physics reading, but had no understanding of these terms. I found this article because it was featured on the WP Main Page (the editors of which obviously thought that it would interest a broad readership). Now I have some understanding, consistent with my limited mathematical education; those with more education will understand more. Great article! Finell (Talk) 01:44, 2 November 2005 (UTC)
-
- I find this article interesting precisely because it has the problem of explaining something so "hard-core-Math". I think the ideal in dealing with a deep specialized topic is to naturally sift people toward basic texts that are at the level they can understand. Expecting the article to explain eigen[X]s to people who aren't familiar with algebra would be impossible, but there should be some way of pointing people in the direction of more basic articles that they can read to build up the requisite knowledge. Sending them towards increasingly stratospheric articles is no help at all.
- The introduction is a place to really see if consensus can hash out something that fits the encyclopedic tone and yet isn't inaccessible to someone who only has a high-school level math education. I think this article is trying to do too much by lumping all the eigens into one; it's an interesting experiment nonetheless, but each term might have to be defined in its own sentence (especially to avoid the vague definition). The reliance on the figure in the intro is not good, and I'm definitely opposed to referring to it from the text. Metaeducation 14:06, 2 November 2005 (UTC)
- You are right. Before it got featured on the main page, the article had reached the following equilibrium (see the version before the 1st of November and the discussion at the Featured Article Candidate page of this article): don't define the eigenXs at all, say only that they are important quantities in math, and refer to the picture for an informal definition, with the exact definition coming in the first section. Yesterday, as the article came on the front page, one new editor said: "the lead doesn't define anything; this is bad style; I will provide an informal definition"; and she did. OK, why not. In my opinion this was a good edit. You seem to disagree and prefer three distinct definitions in the lead. I think this is too technical and doesn't correspond to the editorial line for math articles. Quotation: It is a good idea to also have an informal introduction to the topic, without rigor, suitable for a high school student or a first-year undergraduate, as appropriate. For example, "In the case of real numbers, a continuous function corresponds to a graph that you can draw without lifting your pen from the paper, that is, without any gaps or jumps." The informal introduction should clearly state that it is informal, and that it is only stated to introduce the formal and correct approach. If a physical or geometric analogy or diagram will help, use one: many of the readers may be non-mathematical scientists. Defining a complicated math thing in one sentence and without a picture is a very hard task. I thought the figure was explicit enough and was a good informal and intuitive definition. From reading the positive comments on the FAC page I had the feeling I was not alone. If you have another opinion: don't hesitate, be bold! Vb 15:02, 2 November 2005 (UTC)
Old English?
Is the relation of the German word eigen to Old English really in the scope of this article? To me it's distracting and just a piece of trivia in this context. --doerfler 15:35, 10 November 2005 (UTC)
- I don't mind if you remove this. Vb 16:07, 10 November 2005 (UTC)
Linear eigenfunction operators
Hello! I've studied mathematical physics and encountered some problems (they appear in the chapter on eigenfunction methods).[9] I hope someone can tell me why they are as they are.
Now we have a linear eigenfunction that gives
This is the first question. The second one is
where
- is the weight function. What role (or physical meaning) does it play?
PS: I'm not quite sure about what I wrote. If there is any mistake, please correct me! ^^
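- One common setting in which a weight function appears (and possibly the one meant here, though that is only a guess) is the Sturm-Liouville form

\mathcal{L}\, y_n(x) = \lambda_n\, \rho(x)\, y_n(x), \qquad \int_a^b y_m^*(x)\, \rho(x)\, y_n(x)\, \mathrm{d}x = 0 \quad (m \neq n),

where \mathcal{L} is a linear differential operator. The weight function \rho(x) fixes the inner product with respect to which the eigenfunctions are orthogonal; physically it often plays the role of a density, for example the mass per unit length of a vibrating string.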
Connection between spectral radius and matrix norm for normal matrices
The reasons for my reverts are as follows:
- Only square matrices have eigenvalues.
- The operator norm is not the least upper bound for the moduli of its eigenvalues; the spectral radius is the least upper bound.
- Even for normal matrices, different vector norms give different operator norms. For instance, the matrix
- R = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}
- (rotation over 45 degrees) is orthogonal, hence normal. Its spectral norm is 1 and its maximum column sum norm is \sqrt{2}.
-- Jitse Niesen (talk) 02:13, 5 April 2006 (UTC)
I assume the maximum column sum norm is the operator norm induced by the L^1 norm on C^2; which unit vector do you then take to achieve sqrt(2)? Mct mht 02:22, 5 April 2006 (UTC)
- Shoot, you're right, sorry Jitse. Something might be wrong with your notation there, though. The operator norm induced by the L^{\infty} norm of that matrix is indeed sqrt(2). Thanks. Mct mht 02:33, 5 April 2006 (UTC)
Statement about 2-norm of a normal matrix added back. Mct mht 02:46, 5 April 2006 (UTC)
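For anyone who wants to check the numbers in this thread, a minimal sketch, assuming NumPy is available (variable names are mine), comparing the spectral (2-)norm, the maximum column sum (1-)norm, and the spectral radius of the 45-degree rotation matrix:

import numpy as np

theta = np.pi / 4                                  # 45-degree rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

spectral_norm   = np.linalg.norm(R, 2)             # largest singular value: 1 for any rotation
column_sum_norm = np.linalg.norm(R, 1)             # maximum absolute column sum: sqrt(2) here
spectral_radius = max(abs(np.linalg.eigvals(R)))   # eigenvalues have modulus 1

print(spectral_norm, column_sum_norm, spectral_radius)   # ~1.0, ~1.4142, 1.0

The 2-norm agrees with the spectral radius here because the matrix is normal, while the 1-induced norm does not, which is exactly the third point above.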
Eigenvalues & Eigenvectors of matrices
Under the eigenvalues & eigenvectors of matrices section, could the part regarding "Finding Eigenvectors" be expanded? With only a limited understanding of eigenvectors, it is unclear to me how to actually find an eigenvector from an eigenvalue. I hope someone will be able to expand on this part of the article.
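Until that section is expanded, the standard recipe in brief: for a known eigenvalue \lambda of A, the eigenvectors are the nonzero solutions v of the homogeneous system (A - \lambda I)v = 0. A small worked example, with the matrix chosen purely for illustration:

A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \quad \lambda = 3: \qquad (A - 3I)v = \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = 0 \;\Longrightarrow\; v_1 = v_2,

so every nonzero multiple of (1, 1)^T is an eigenvector of A for \lambda = 3.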
Normal matrix
Not a bad article, but I am surprised that Wikipedia calls it "a featured article, (...) one of the best articles produced by the Wikipedia community." Just looking at it for one minute one can spot a serious error: The statement that "a matrix is diagonalizable if and only if it is normal" is absolutely false. I did not read the rest of the article after a mistake of this caliber. Grand Vizier 20:46, 18 April 2006 (UTC)
- You're right, it should be "a matrix is unitarily diagonalizable if and only if it is normal". Thanks for mentioning this. However, I hope that in the future, you will be more constructive and correct it yourself; otherwise it might appear that you just want to show off how smart you are. -- Jitse Niesen (talk) 03:33, 19 April 2006 (UTC)
No need to feel threatened. It is wrong if it is wrong. Grand Vizier 20:51, 19 April 2006 (UTC)
I typed that sentence. I thought it should seem obvious that unitary equivalence is meant. It would be clearly wrong otherwise; the SVD says every matrix is "diagonalizable," even non-square ones. Mct mht 17:12, 19 April 2006 (UTC)
- "Diagonalizable" has a specific meaning in matrix theory, which is explained in diagonalizable matrix. With this meaning, not every matrix is diagonalizable. As it says in the article, "a matrix is diagonalizable if and only if the algebraic and geometric multiplicities coincide for all its eigenvalues". -- Jitse Niesen (talk) 05:28, 20 April 2006 (UTC)
Error in the generalized eigenvalue problem
It looks like an error got inserted. Take for example the real symmetric matrices A = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} and B = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}; then A - \lambda B = \begin{pmatrix} 1 & -\lambda \\ -\lambda & -1 \end{pmatrix}, \det(A - \lambda B) = -1 - \lambda^2, and the solutions to the generalized eigenvalue problem Av = λBv are the pure imaginary eigenvalues i and − i.
Therefore, there exists a pair of real symmetric matrices such that the solutions of the corresponding generalized eigenvalue problem are not real ... the contrary is stated on the page ... and I'm about to take it out ... Actually, what is stated there is true for each of the matrices independently and for its own simple eigenvalue problem, but for the generalized eigenvalue problem it is not true. tradora 00:51, 9 May 2006 (UTC)
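A quick numerical confirmation of this point, as a sketch assuming SciPy is available (the matrices are a standard counterexample consistent with the eigenvalues quoted above):

import numpy as np
from scipy.linalg import eig

A = np.array([[1.0, 0.0],
              [0.0, -1.0]])                        # real symmetric
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])                         # real symmetric

eigenvalues, _ = eig(A, B)                         # solves A v = lambda B v
print(eigenvalues)                                 # approximately [0.+1.j, 0.-1.j]

Symmetry of A and B alone does not force real generalized eigenvalues; for that one typically also needs one of the matrices (say B) to be positive definite.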