Talk:Orthogonal matrix

From Wikipedia, the free encyclopedia

WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: B+ Class, High Priority. Field: Applied mathematics
One of the 500 most frequently viewed mathematics articles.


[edit] Rotations and orthogonal matrices

User:Oleg Alexandrov said "I am sorry to revert again. Not every orthogonal transformation is a rotation. "

Please don't be sorry. Would you elaborate on this in the article? I was taught that an orthogonal transformation is a rotation at the origin. I suspect I am not the only one who believes this, and some clarification can help. -- Taku 04:02, August 14, 2005 (UTC)
I hope I have time to do this tomorrow. I will make sure I don't forget. Oleg Alexandrov 04:35, 14 August 2005 (UTC)
The truth is more subtle. When we use a matrix to describe a linear transformation, we implicitly accept that we are working with a vector space, rather than, say, an affine space (which does not have an origin!). That aside, the geometric content of orthogonality is that "lengths do not change". The squared length of a vector v is produced by v^T v; the matrix G transforms v to G v; the transpose of G v is v^T G^T; thus length preservation means v^T G^T G v = v^T v for all v. Hence the orthogonality condition, G^T G = I_n. A rotation takes the form of a special orthogonal matrix, where "special" is a technical term meaning the determinant is +1. The determinant of any orthogonal matrix is either +1 or −1, so fully half of them do not correspond to rotations. For example, in the description of point groups for crystallography we have not only rotations, but also reflections, inversions, and rotary reflections. KSmrq 23:17, 2005 August 14 (UTC)
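The length-preservation argument above is easy to check numerically. The sketch below (NumPy, with an arbitrarily chosen angle; purely illustrative) verifies the orthogonality condition, length preservation, and the determinant fact for a sample rotation matrix.

```python
import numpy as np

# A rotation matrix G for an arbitrary angle; any orthogonal matrix works here.
theta = 0.7
G = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthogonality condition: G^T G = I.
assert np.allclose(G.T @ G, np.eye(2))

# Length preservation: (Gv)^T (Gv) = v^T v for any v.
v = np.array([3.0, -4.0])
assert np.isclose((G @ v) @ (G @ v), v @ v)

# The determinant of an orthogonal matrix is +1 or -1.
assert np.isclose(abs(np.linalg.det(G)), 1.0)
```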
Well, as I mentioned below, and as written at rotation, things are not as simple. For example, the matrix with +1, -1, -1 on the diagonal is orthogonal with determinant equal to 1. Does this correspond to a true 3D rotation though? To me it looks like two reflections, and I am not sure if in this case two reflections add up to a rotation. Oleg Alexandrov 23:51, 14 August 2005 (UTC)
That is simply a rotation of 180° about the x-axis.--Patrick 01:22, 15 August 2005 (UTC)
You are both correct. The matrix with diagonal (+1,−1,−1) is a reflection across both xy and xz, and also a rotation of 180° around x, the line of intersection. A single reflection has determinant −1; the determinant of a product is the product of the determinants, so the composition of two reflections produces a matrix with determinant +1; an orthogonal matrix with determinant +1 is, by definition, a rotation. The product of two rotations has determinant +1 again, so rotations form a subgroup, SO(n), of the orthogonal group, O(n). Have a look at the mirror discussion in Euclidean plane isometry. KSmrq 15:18, 2005 August 15 (UTC)
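Both readings of the diagonal (+1, −1, −1) matrix can be confirmed directly; the NumPy sketch below (illustrative only) composes the two reflections and compares the result with a 180° rotation about the x-axis.

```python
import numpy as np

# Reflection across the xy-plane (z -> -z) and across the xz-plane (y -> -y).
refl_xy = np.diag([1.0, 1.0, -1.0])
refl_xz = np.diag([1.0, -1.0, 1.0])

# Their composition is the matrix with diagonal (+1, -1, -1)...
M = refl_xy @ refl_xz
assert np.allclose(M, np.diag([1.0, -1.0, -1.0]))

# ...which equals a rotation of 180 degrees about the x-axis.
t = np.pi
rot_x = np.array([[1.0, 0.0, 0.0],
                  [0.0, np.cos(t), -np.sin(t)],
                  [0.0, np.sin(t),  np.cos(t)]])
assert np.allclose(M, rot_x)

# Each reflection has determinant -1; the product of two has determinant +1.
assert np.isclose(np.linalg.det(M), 1.0)
```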

Thanks a bunch. I think this sentence is especially problematic: "Orthogonal transformations include rotations and reflections." This begs the question: what else does the transformation include? In writing about math, we can be direct. I also realized it is ambiguous whether a rotation (in the context of the article?) includes reflections. I know I wasn't careful as well. Any idea? -- Taku 11:51, August 14, 2005 (UTC)

Well, the article says somewhere below that any orthogonal matrix is obtained from combining rotations around the coordinate axes and reflections against the coordinate axes.
I think the problem is that the concept of rotation is (I think) rather complicated beyond 3D, see the rotation article. There are true rotations, and there are so-called Clifford-style rotations. I don't know much about this subject though. Oleg Alexandrov 16:04, 14 August 2005 (UTC)
I have written out the possibilities for 1D, 2D, and 3D.--Patrick 21:38, 14 August 2005 (UTC)
Thanks a lot! Taku, any comments? Oleg Alexandrov 21:56, 14 August 2005 (UTC)
I wouldn't rely too much on the rotation article; it needs work. The standard definition of rotations in any dimension does not, and cannot, refer to an axis. Already that fails in the plane! The reference to "Clifford-style" rotations is more confusing than helpful. In 2D all rotations are planar; likewise in 3D, though we now have a choice of planes. Always we have a fixed point, perhaps no more; the fixed axis in 3D is an anomaly. Consider that in 4D we can have a "planar" rotation in the first two dimensions and a separate "planar" rotation in the last two dimensions. Only the origin is fixed. Up in 5D, we have the same thing — a "Clifford-style" rotation — but with a dimension left over. The vector in that direction gives a line of fixed points, but it doesn't tell us much about the rotation − not even which dimensions are paired. Unfortunately, 3D rotations are a very special case, and a poor hint towards the n-dimensional structure. KSmrq 16:03, 2005 August 15 (UTC)

In a hasty edit, I wrote "include rotations and reflections". Does it include anything else? That depends on how you define "reflection", I'm afraid, and I think conventions differ. So ideally, something should be added to the article explaining that conventions differ, and saying which one is followed in this article. Michael Hardy 01:25, 15 August 2005 (UTC)

The mathematics is clear and simple, the naming less so. The article already mentions that we can always find a choice of axes so that the matrix consists of 2×2 rotation blocks and isolated ±1 entries. If we group the −1 entries into pairs, those are also rotation blocks, as are (trivially) +1 pairs. What can be left is only a single +1, a single −1, or one of each. Therefore we can decompose any orthogonal transform as some number of planar rotations in mutually orthogonal subspaces, and perhaps a single reflection orthogonal to all of them. As for the names, in the 3D conventions of crystallography the possibilities are called identity (+1,+1,+1), rotation (R,+1), reflection (+1,+1,−1), rotary reflection (R,−1), and inversion (−1,−1,−1). Rotary reflections go by an assortment of names. KSmrq 15:45, 2005 August 15 (UTC)
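The block decomposition described above can be illustrated concretely. The sketch below (NumPy, with a hypothetical `rot` helper and arbitrary angles) assembles a 5×5 orthogonal matrix from 2×2 rotation blocks and a lone +1, and checks that a pair of −1 entries is itself a rotation block.

```python
import numpy as np

def rot(t):
    # A 2x2 planar rotation block for angle t (helper for this sketch).
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

# Grouping two -1 entries gives a rotation block: R(pi) = diag(-1, -1).
assert np.allclose(rot(np.pi), np.diag([-1.0, -1.0]))

# Assemble the canonical block form: one rotation block, one (-1, -1)
# pair (also a rotation block), and a single leftover +1 entry.
Q = np.zeros((5, 5))
Q[0:2, 0:2] = rot(0.3)
Q[2:4, 2:4] = rot(np.pi)
Q[4, 4] = 1.0

assert np.allclose(Q.T @ Q, np.eye(5))    # orthogonal
assert np.isclose(np.linalg.det(Q), 1.0)  # all blocks rotations: determinant +1
```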
KSmrq, you are very right! Taku was also right, and I was wrong to say that not every orthogonal matrix of determinant 1 is a rotation.
I have a request. You wrote a very good paragraph above. The facts might be present in the text too, but I doubt that they are easy to find. This is because this article is getting more and more complicated as one reads down. So, my request is the following. Do you think it would be possible to add the information you wrote above to the section ==Geometric interpretations== and maybe move that section above the section ==Properties==? This is so that a quick reader who does not know much math does not have to read through all those complicated facts and formulas in the ==Properties== section before getting the essence of what orthogonal matrices are all about. What do you think? Oleg Alexandrov 16:20, 15 August 2005 (UTC)

[edit] Matrix representation of Clifford algebras

I propose to delete the entire section "Matrix representation of Clifford algebras". Although I have a fondness for the material, having written about it myself elsewhere, I believe it is entirely out of place in this article. It tells us nothing helpful about orthogonal matrices, and to construct the full Clifford algebra we must abandon orthogonality anyway.

What might be more helpful is to describe how to use "complex structures" in an article on Spin groups. For example, the orthogonal matrix

J = \begin{bmatrix}0&-1\\1&0\end{bmatrix}

squares to minus the identity, the identity being, of course,

I = \begin{bmatrix}1&0\\0&1\end{bmatrix}

Taking linear combinations of I and J, together with their multiplication table (especially J^2 = −I), we can construct a Clifford algebra equivalent to the complex numbers. Within this we have the complex numbers of unit modulus, which act as rotations by left multiplication. The complex conjugate of aI+bJ is its transpose. This is a somewhat misleading special case. So is my next example.
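This construction is easy to verify numerically. The sketch below (NumPy, with an arbitrarily chosen pair of complex numbers) checks that J squares to −I and that aI + bJ multiplies exactly like a + bi, with transpose as conjugation.

```python
import numpy as np

J = np.array([[0.0, -1.0],
              [1.0,  0.0]])
I = np.eye(2)

# J is orthogonal and squares to minus the identity.
assert np.allclose(J.T @ J, I)
assert np.allclose(J @ J, -I)

def to_matrix(z):
    # Represent the complex number a + bi as the matrix aI + bJ.
    return z.real * I + z.imag * J

# The representation respects multiplication: (aI+bJ)(cI+dJ) matches zw.
z, w = 2 - 1j, 0.5 + 3j
assert np.allclose(to_matrix(z) @ to_matrix(w), to_matrix(z * w))

# The transpose of aI + bJ represents the complex conjugate.
assert np.allclose(to_matrix(z).T, to_matrix(z.conjugate()))
```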

Complex structures only exist in even dimensions, so we next look at 4×4 matrices and, along with I_4, take

X = \begin{bmatrix}0&-J\\-J&0\end{bmatrix}
Y = \begin{bmatrix}0&I\\-I&0\end{bmatrix}
Z = \begin{bmatrix}J&0\\0&-J\end{bmatrix}

Each of these squares to −I_4, and their products anticommute (X Y = −Y X). With these four matrices we can construct a Clifford algebra equivalent to the quaternions. Using unit norm quaternions for similarity transforms, we produce the Spin group covering 3D rotations. Of course, if we write the X, Y, and Z matrices as complex matrices we see Pauli spin matrices peeking out.
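These identities can be checked directly; the NumPy sketch below assembles the 4×4 matrices from their 2×2 blocks and verifies the quaternion relations (illustrative only).

```python
import numpy as np

J = np.array([[0.0, -1.0],
              [1.0,  0.0]])
I2 = np.eye(2)
Z2 = np.zeros((2, 2))

# The 4x4 matrices from the text, assembled from 2x2 blocks.
X = np.block([[Z2, -J], [-J, Z2]])
Y = np.block([[Z2, I2], [-I2, Z2]])
Z = np.block([[J, Z2], [Z2, -J]])
I4 = np.eye(4)

# Each squares to minus the identity...
for M in (X, Y, Z):
    assert np.allclose(M @ M, -I4)

# ...and their products anticommute, e.g. XY = -YX (in fact XY = Z).
assert np.allclose(X @ Y, -(Y @ X))
assert np.allclose(X @ Y, Z)
```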

The general case uses a sufficient number of independent anticommuting matrices, each squaring to minus the identity, to construct a Clifford algebra of any dimension we like. We form the linear combinations with unit norm, which sandwich to act as reflections. Then taking pairs of reflections (the even subgroup) we form rotations, and the general Spin groups.

Fun stuff, but I just don't think it belongs in this article. So if I hear no objections soon, this section disappears. --KSmrq 15:45, 2005 August 16 (UTC)

I hate to see the stuff gone as well. So can we move it somewhere else? Clifford algebras, Representations_of_Clifford_algebras or a new article Matrix representation of Clifford algebras or something. -- Taku 21:58, August 16, 2005 (UTC)

[edit] Rewritten

In bold Wikipedia fashion, I have completely rewritten the article. This includes reorganizing, removing, revising, and writing a great deal of new material. If all has gone well, it will be more helpful both for numerical linear algebra and for mathematical theory. I have left a little hint at the bottom that could be filled with the Clifford algebra construction described on this talk page, but was reluctant to make the article longer. Enjoy, and happy editing. --KSmrqT 04:43, 15 October 2005 (UTC)

[edit] Special characters

A seriously misguided edit bracketed instances of U+2008 (&puncsp; = [PUNCTUATION SPACE]) with the deprecated HTML markup <font face>. This is a Really Bad Thing, for several reasons explained below. The intent of my use of &puncsp; (as a UTF-8 character, not a character entity) was to work around current limitations in display of mathematics, to prevent bad line breaks in the middle of formulae yet not clutter the edit page with ugly &nbsp; entities everywhere. So a first Really Bad Thing about the bracketing is that the net effect is to make editing far worse than with the original &nbsp; clutter.

I do not like it either, it is a last resort; in this case it was experimental, an alternative for reverting your edit.--Patrick 21:18, 17 October 2005 (UTC)

A second Really Bad Thing is that the <font> tag and the use of <font face> are deprecated [1] in HTML, for the very good reason that they should be replaced by use of CSS styling. (The HTML 4.01 Spec dates from 1999, so this is hardly news.) Deprecated tags have been removed from XHTML, and XHTML is necessary if Wikipedia is to move forward to MathML.

Ok, <span style="font-family: arial">a b</span> giving a b can be used.--Patrick 21:23, 17 October 2005 (UTC)

A third Really Bad Thing is that it defaces the edit page principally for the benefit of some users of Microsoft's seriously deficient Internet Explorer browser.

You are not writing for yourself but for your readers. IE is very popular.--Patrick 21:23, 17 October 2005 (UTC)

A fourth Really Bad Thing is the explicit specification of a single font exclusive to Microsoft, "Arial Unicode MS", which is not universally available and may not fit with a user's other font choices.

Ok, Arial is better, see above.--Patrick 21:25, 17 October 2005 (UTC)

Therefore, I have reverted the edit and replaced every instance of &puncsp; with the &nbsp; character entity. If possible it would be nice to replace this with the UTF-8 character, but when I try to paste that literally it becomes an ordinary space. --KSmrqT 11:13, 17 October 2005 (UTC)

Very much agree with KSmrq. That MS fontcruft everywhere sucks. One should not forget that this page is meant to be edited by people, so ugly html tags should be kept to the minimum. --Oleg.
The main thing is that the rendered page is not a mess, especially not with a popular browser like IE. The second important thing is that the wikitext is kept simple, so I am pleased having &nbsp; back.--Patrick 13:55, 17 October 2005 (UTC)
Your substitution suggests you have Arial Unicode MS, now officially available only as part of the Microsoft Office suite. However, I believe the character exists in plain Arial, so this sounds like a browser problem. Once again, quality sinks to the lowest common denominator. In future, however, I promise you will see more and more Unicode characters because of Wikipedia's adoption of UTF-8, so you might want to do yourself a favor and begin to cope now with the MathML characters on this page. (Sadly, the arrival of STIX fonts [2] has been pushed to mid-2006; but that's still Real Soon Now.) Notice that already Wikipedia's Unicode page tells users (in a note at the bottom) that if they can't see the characters, it's their problem, and suggests switching browsers (gasp!). --KSmrqT 15:51, 17 October 2005 (UTC)
That note on the Unicode page is not about Wikipedia policy. If you want to use a special character, please check first whether it works in IE. Consider using LaTeX, or if the symbol is not available, upload an image.--Patrick 21:38, 17 October 2005 (UTC)
Currently there is still a problem on the page: I need
  • <span style="font-family: 'Arial Unicode MS'">a ≅ b ⋉ c</span>
to get a ≅ b ⋉ c
Here just Arial does not work, so this may not work for all IE users. Also there is the problem of not displaying in the edit box.--Patrick 22:18, 17 October 2005 (UTC)

I guess one had better use LaTeX as PNG images rather than verbose Arial Unicode MS font tags. Oleg Alexandrov (talk) 09:09, 18 October 2005 (UTC)

Done, except that we do not seem to have Image:Rtimes2.png in LaTeX.--Patrick 10:58, 18 October 2005 (UTC)
Reverted, because \ltimes (correct, [LEFT NORMAL FACTOR SEMIDIRECT PRODUCT]) is not \rtimes, and because it's inappropriate to foist your limitations on the world. Let's go clarify a character policy at WikiProject Mathematics. --KSmrqT 12:04, 18 October 2005 (UTC)

[edit] Orthogonal matrices over fields other than R

I presume the definition of an orthogonal matrix holds for fields other than R but it would be nice if the article stated so. The article orthogonal group uses the more general definition over any field however it does not explicitly state the form of its elements but rather references this page. Since orthogonal groups over finite fields are important in group theory it may be useful to generalize the definition given here. Since I haven't worked too much with orthogonal groups I thought it would be best to let someone verify first that the definition does in fact generalize. TooMuchMath 02:55, 14 April 2006 (UTC)

Well, orthogonal matrices come from isometries, so you need some kind of distance to talk about such matrices. One can talk about orthogonal matrices over complex numbers, where instead of transpose one uses the conjugate transpose.
The definition
Q^T Q = Q Q^T = I
does generalize to matrices over any ring, but I don't know how useful it is. Oleg Alexandrov (talk) 04:04, 14 April 2006 (UTC)
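The point that the defining relation makes sense over any ring can be checked on a toy example. The matrix below over GF(5) is an arbitrary illustration (not taken from the discussion), with 4 playing the role of −1.

```python
import numpy as np

# An example 2x2 matrix over GF(5); the entry 4 is -1 mod 5.
Q = np.array([[0, 1],
              [4, 0]])

# The defining relation Q^T Q = Q Q^T = I holds modulo 5.
assert np.array_equal((Q.T @ Q) % 5, np.eye(2, dtype=int))
assert np.array_equal((Q @ Q.T) % 5, np.eye(2, dtype=int))
```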
I'm mostly interested in groups of Lie type, i.e. the case where the field is finite, of order p^d. Unfortunately I've been unable to find a source that confirms the defining relation is the same as for the real case.
Already the beginning of this article states "Although we consider only real matrices here, the definition can be used for matrices with entries from any field." It would not be a service to most readers if we encumber this (already long) article with explorations of finite fields. Perhaps such material, if needed, would be more at home in the orthogonal group article, since that is necessarily more abstract and algebraic, and where you originally looked. --KSmrqT 10:20, 14 April 2006 (UTC)
Oh right you are! Apparently this is the definition that I need and I'll include the appropriate generalization of the definition on the orthogonal group page. Hopefully I'll be able to enlist some support to come up with some examples over a finite field there. Thanks. TooMuchMath 17:26, 15 April 2006 (UTC)

[edit] Orthonormal matrix

Section moved from Talk:orthonormal matrix -- Jitse Niesen (talk) 14:55, 2 June 2006 (UTC)

I do not think the term orthonormal matrix is standard anywhere. Perhaps someone created this page in an effort to popularize the term. Michael Hardy 03:18 Mar 22, 2003 (UTC)

It is indeed nonstandard, so I merged the article orthogonal matrix with this article. However, I do seem to remember having seen it used somewhere (hmm, that's a contorted verbal construction). -- Jitse Niesen (talk) 14:55, 2 June 2006 (UTC)

I disagree. In my current courses, Continuum Mechanics and Fluid Mechanics at the University of Minnesota, the term orthonormal is used quite often and is found in our textbook and in a few research papers, in particular "Introduction to the Mechanics of a Continuous Medium" by Lawrence E. Malvern. It is also noted on http://mathworld.wolfram.com/OrthonormalBasis.html I've even had one assignment requesting I find orthonormal eigenvectors. The manner in which it is mentioned in my courses implies it is standard in the field of Aerospace Engineering. -- Kruzicka (talk) 16:55, 28 September 2006 (UTC)

The term "orthonormal" is completely standard in the phrase "orthonormal basis". And, confusingly, the columns of an "orthogonal matrix" do comprise an orthonormal basis. Despite this overlap, the phrase "orthonormal matrix" is at odds with standard mathematical practice, including applied mathematics and physics. To quote from the article,
  • A real square matrix is orthogonal if and only if its columns form an orthonormal basis of the Euclidean space R^n with the ordinary Euclidean dot product, which is the case if and only if its rows form an orthonormal basis of R^n. It might be tempting to suppose a matrix with orthogonal (not orthonormal) columns would be called an orthogonal matrix, but such matrices have no special interest and no special name; they only satisfy M^T M = D, with D a diagonal matrix.
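The distinction drawn in the quoted passage is easy to verify numerically; the matrix below is an arbitrary example with mutually orthogonal but non-unit columns.

```python
import numpy as np

# Columns (3, 6) and (-1, 0.5) are mutually orthogonal but not unit length.
M = np.array([[3.0, -1.0],
              [6.0,  0.5]])

# Orthogonal (not orthonormal) columns give only M^T M = D, a diagonal matrix.
assert np.allclose(M.T @ M, np.diag([45.0, 1.25]))

# Normalizing each column produces a genuine orthogonal matrix.
Q = M / np.linalg.norm(M, axis=0)
assert np.allclose(Q.T @ Q, np.eye(2))
```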
Since you misinterpret the content of the MathWorld site, which I can see, I'm inclined to suspect you make the same mistake with the Malvern text, which I cannot see. --KSmrqT 22:53, 28 September 2006 (UTC)

[edit] comment re a recent minor edit

as the edit summary shows, i wasn't as polite as can be. i added a simple e.g. for the claim that SO(n) is not simply connected in general. it's a simple and relevant statement. first it got reverted (since it wasn't quite correct), but one can simply correct it rather than unnecessarily revert. then the correction got reverted, with the given reason being it's a "distracting detail, with sloppy italics." if someone is bent on removing what was added, fine, as what's relevant and what's not is a subjective judgement. i was pissed, and hence the tone in edit summary, because the reverts didn't seem to be of constructive nature. Mct mht 21:15, 18 June 2006 (UTC)

To be honest, I reverted the "projective plane" version for two reasons, only one of which I stated in my edit summary. I felt obliged to state the second reason when I removed the corrected version. And I notice that, despite my explicit mention of improper italics, the text I removed was reinserted with the same annoying sloppiness. We do not say SO(3), italicizing the parentheses and the 3, we say SO(3). That kind of edit raises doubts about the editor.
Setting questions of form aside, let's discuss substance. The section "Spin and Pin" lives in an article devoted to orthogonal matrices, not orthogonal groups. So we should not expect readers to have a clue about the meaning of the sentence in question,
How on earth does that help, for those who do not know the meaning of "homeomorphic" (not linked, by the way), nor the fact that the fundamental group of RP^3 is Z_2? As stated in my edit summary, this sentence provides a distracting detail which is unhelpful to the thrust of the paragraph. Unfortunately, most of the paragraph will be gibberish anyway to the majority of readers who will otherwise benefit from the article, because they also will not know the terms simply connected, covering, nor spin group. But let's not try to explain one mystery by means of another.
We have a dedicated article on SO(3), which is a fine place to include such details; we also have an orthogonal group article, another likely target. Both articles need work, and would be better served by the attention being diverted by this unhelpful detail in an article that doesn't need it.
In fact, this article might be improved by removing this entire section, since it really has little to do with orthogonal matrices. Really the only justification is the last sentence, mentioning the use of orthogonal matrices as a basis for Clifford algebra. That may sound harsh, but I know the editor who wrote this material won't be offended. (Check the edit history!)  ;-D --KSmrqT 23:22, 18 June 2006 (UTC)
removing edits while citing improper italicization as a reason, i am sorry, is pure petty bs. but otherwise that's a very detailed response, fair enough. that section was already there, i just thought to add a simple example to illustrate what was claimed, is all. Mct mht 23:38, 18 June 2006 (UTC)
Just to be clear, my edit summary said (verbatim): "remove corrected (except sloppy italics), but distracting detail". I'll leave it to others to decide if they think that is "citing improper italicization as a reason"; personally I think that view is a distortion, and in any case is not what I meant. What is clear is that I raised a concern about italics that was ignored until Michael Hardy asserted the same point in his edit.
It's a fact of life that edit summaries are terse. It's a fact of life that sometimes that leads to misunderstandings. It's a fact of life that editors edit mercilessly, as Wikipedia warns us all. It's a fact of life that toes get stepped on in the process. So we use talk pages to sort things out. Isn't it remarkable how much talk can arise from a single sentence?  :-)
Can we now focus on content? I still feel the added sentence is an unhelpful distraction, much as I love SO(3) and the Dirac belt trick (not currently found in Wikipedia). Apropos, I quote
  • For example, “Murder your DARLINGS.” This common admonition to writers (suggesting that they excise the parts of their work that most delight them) is widely misattributed to the likes of Samuel Johnson, Oscar Wilde, George Orwell, F. Scott Fitzgerald, Dorothy Parker, and William Faulkner. Its actual author was Sir Arthur Quiller-Couch, who wrote in The Art of Writing (1916), “Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it — whole-heartedly — and delete it before sending your manuscript to press. Murder your darlings.”
Never mind who first said it; many great writers support it. So do I.
If the sentence remains, I will amend it to work better, perhaps as follows.
  • "For instance, the group SO(3) is not simply connected because, famously, a double loop shrinks to a point while a single loop does not."
Otherwise, do we remove the sentence alone, or the entire section? --KSmrqT 09:47, 19 June 2006 (UTC)

KSmrg: probably a very late comment, but i appreciate that you explained at length. sry i misunderstood your intentions and got pissed off. Mct mht 07:22, 29 June 2006 (UTC)

Thanks. You can probably tell I enjoy writing, including writing about writing. (And now, writing about writing about writing!) One challenge in writing is to be concise, but that often requires considerable work "behind the scenes", so writing about the writing can get lengthy. I do like what Joseph Pulitzer (of the Pulitzer Prize) said: "Put it before them briefly so they will read it, clearly so they will appreciate it, picturesquely so they will remember it and, above all, accurately so they will be guided by its light." --KSmrqT 08:32, 29 June 2006 (UTC)

[edit] revert poorly worded statement

I was trying to write an introduction that serves to orient the reader as to the practical applications of orthogonal groups, rather than give a mathematical description immediately. Also included was an immediate link to the orthogonal group article. This link should not be buried in the article somewhere. Also, in what way was the statement poorly worded? PAR 16:29, 6 January 2007 (UTC)

I'm not sure why a link to the orthogonal group article should be included. As you wrote it, you assumed that the reader would know what an orthogonal group is, and perhaps also a representation. These are more advanced concepts than orthogonal matrix, in my opinion. On the other hand, I think it's good to make the connection with rotations in the lead.
"Poorly worded" may refer to the following:
  • The set of orthogonal matrices form … should read The set of orthogonal matrices forms …
  • It's not clear whether rotations in 2d or 3d are "most commonly" encountered.
  • Improper rotations are not in SO(3).
Of course, I can't speak for KSmrq. -- Jitse Niesen (talk) 17:58, 6 January 2007 (UTC)
Oh, right, that was a typo. How about:

The set of orthogonal matrices form a representation of the associated orthogonal group. The most commonly encountered orthogonal groups are O(2) and O(3) which correspond to the set of proper rotations and improper rotations in physical 2-space and 3-space.

I'm trying to give an intuitive understanding of the use of orthogonal matrices by showing how they may be used to calculate physical rotations. This would be very helpful for someone just coming into the subject. The present introduction is not as useful to such a person. Please help me to state this in the correct way. PAR 20:37, 6 January 2007 (UTC)
No, such a statement has no place here. In fact, you seem to have ignored almost everything Jitse Niesen told you, all points with which I agree. You wish to focus on your limited interest in orthogonal matrices; this article has a much broader scope. I suggest you look at the rotation matrix article, which may be more to your taste. I also suggest you read the "Overview" section of this orthogonal matrix article, where such things as rotations and orthogonal groups are already explicitly mentioned, along with other equally important connections. --KSmrqT 08:09, 7 January 2007 (UTC)
KSmrq - Look, it's true I have not edited this page before, and I come here not knowing what kind of destructive morons you have had to deal with in the past, but please give me the benefit of the doubt and don't assume that I am one of them, ok? Three points:
  • Yes, I again forgot to put in "forms" instead of "form". Sloppy on my part, I apologize. I understand now what you meant by "poorly worded statement".
  • You seem to have ignored what I said - The introduction should "introduce" orthogonal matrices to the general reader, not to the mathematician. This is not the same as focusing on my "narrow interest".
  • Why is the "rotation matrix" article more to my taste? What did I write that seemed to indicate I was specifically interested in this subset of orthogonal matrices? Or did you again ignore what I wrote?
I'm not a mathematician, I'm a physicist, and I need to understand some points about orthogonal matrices, etc. I'd like to add to the article if I feel it's lacking, but not without consensus. I'm willing to be corrected on mathematical points, but I'm not willing to be treated like a run-of-the-mill destructive idiot. Lighten up, ok? PAR 16:01, 7 January 2007 (UTC)
For reference, your original edit added the following before the table of contents:
And here on the talk page you say:
  • “I'm trying to give an intuitive understanding of the use of orthogonal matrices by showing how they may be used to calculate physical rotations.”
I respond to what I see, and what I have seen so far is troubling, many times over.
The article gives a one-sentence definition, then pauses for a table of contents, then gives an overview. In the few paragraphs of the overview we already connect orthogonal matrices to both rotations and orthogonal groups. We also connect to other topics of importance, including applications in numerical computations. In both structure and content the article could hardly be more simple and clear about this. Yet you propose an edit that seems oblivious to the design, and which elevates one connection above all others.
Then here on the talk page Jitse Niesen enumerates several concerns, including grammar. Again, you seem oblivious, and neither correct the grammar nor take to heart the possibility that your idea of what's "most commonly encountered" may not be a universal experience.
Immediately after the overview the article enumerates five examples. The second example is a 2D rotation matrix, again making the connection. But the remaining examples are also important, to others if not to you.
Next comes the first major section of the body, which immediately gives a general description of a 2D rotation matrix. The article proceeds step by step through a graduated sequence of topics, eventually covering wide swaths of both applications and pure mathematics. One such topic is the spectral decomposition, which I happen to think is much more commonly encountered than mere 2D and 3D rotations, and not just in physics, but in many areas of science and mathematics. And the section just before that covers in depth the connection to orthogonal groups, and to special orthogonal groups, and to the Lie algebra they share. Given the necessarily advanced nature of the mathematics involved, we could hardly reach this topic sooner.
On a purely mathematical point, O(n) is more than the set of n×n orthogonal matrices, we need their multiplication as well. Your sentence seems to consider all n simultaneously, which is not a good idea. As already mentioned, it was wrong to assert that SO(3) includes improper rotations. And when you say “form[s] a representation of”, you get mathematicians twitching in the wrong way; to quote the first sentence of orthogonal group:
Compare this with the definition from group representation:
So your statement introduces confusion about "is" versus "represents"; not helpful.
Now you complain:
  • “Why is the "rotation matrix" article more to my taste? What did I write that seemed to indicate I was specifically interested in this subset of orthogonal matrices? Or did you again ignore what I wrote?”
Since you are the one who brought up SO(3) and “physical rotations”, I am, in fact, paying attention to what you wrote.
When I see this many issues, and of such a basic nature, this quickly, what should I think? I have now responded at more length detailing some of my impressions. They are not favorable, and I believe they justify my prior terse response. --KSmrqT 05:16, 8 January 2007 (UTC)

Not if that comment implied malicious intent on my part. Ok, I'm done being aggravated about this. Suffice it to say I've edited enough articles in fields where I am a bit more competent to appreciate the distaste you have for people with limited knowledge storming in to mess up a good article. Trust me, I'm not one of them. Which brings me to another couple of points:

  • You referred me to the rotation matrix article, which I recently modified. I can't imagine that I did it to your exacting standards, so would you take a look at it and correct any errors, or at least make some suggestions. I'm on a minor learn-and-edit voyage here and would appreciate any help.
  • Regarding the word "represents" - being a physicist, I have learned, for example, that a vector is "represented" by a column matrix. In other words, the matrix is associated with the vector only as long as a coordinate system has been defined, and the association is such that the projections of the vector on the coordinate vectors yield the matrix. (That's a rough description.) A rotation of the coordinate system yields a different matrix representation of the same vector. This was the basis for my saying that the matrices (and yes, matrix multiplication, etc.) form a "representation" of the orthogonal group. Do you have any comments about this point of view?

PAR 06:55, 8 January 2007 (UTC)

Thanks for understanding. No, I never inferred, nor meant to suggest, malicious intent.
For personal reasons I have so far abstained from getting entangled with rotation articles, except (rarely) on a talk page. I think it would drive me nuts to see all the crap that would be inevitable in the Wikipedia environment, and I know too much about rotations not to see every last tiny annoyance. So I reserve that topic for writing elsewhere. But since you ask, I'll try to take a look.
The word "represents" in a map-versus-territory context is something that is good to emphasize. (The article carefully uses language like, "such-and-such a matrix can be interpreted as a rotation.") For example, we can represent a 3D rotation using a 3×3 real matrix, or a nonzero quaternion, or an angular velocity vector, or a 2×2 complex Pauli spin matrix, or a triple of Euler angles. But when you combine in one sentence the terms "group", "representation", and "matrix", you evoke a group representation for an important segment of your audience.
Incidentally, we have tried to quarantine various abstract higher mathematics connections in the orthogonal group article. If you compare to this one, you should see a noticeable difference in topics and emphasis. For example, this article includes applications in numerical linear algebra, while that article includes a connection with Galois cohomology. That one is not in good shape, but I don't anticipate working on it in the near future. --KSmrqT 08:10, 8 January 2007 (UTC)
Ok, I will have to read the orthogonal matrix article more closely. Two questions:
  • Would you say that the "rotation matrix" article should essentially be equivalent to a "special orthogonal matrix" article?
  • Based on your above statements, isn't there a problem with the opening sentence of the orthogonal group article which states:
"The orthogonal group of degree n over a field F ... is the group of n-by-n orthogonal matrices with entries from F, with the group operation that of matrix multiplication."
It's the "is" versus "represents" problem, still not clear in my mind. Rather than "is", don't the matrices "represent" the group (using the sense of "represent" that you used when you said "For example, we can represent a 3D rotation...")? PAR 20:54, 8 January 2007 (UTC)
Let's discuss "rotation matrix" on its talk page.
The definition of orthogonal group is the correct standard definition. We can talk about isomorphic groups, but in this case the matrix version is the archetype. By contrast, a rotation originally is a thing from geometry (or the physical world), not a collection of numbers. A 3×3 rotation matrix is one of many ways to represent and manipulate that geometric thing. --KSmrqT 23:19, 8 January 2007 (UTC)
Ok, I will be editing that article and a number of related articles such as rotation group, rigid body, angular velocity, angular momentum, moment of inertia, etc. The edits will sometimes no doubt be mathematically not to your liking, but I think they will nevertheless be an improvement. I would like them to be even better. Any edit or talk page suggestions would be appreciated. PAR 01:57, 10 January 2007 (UTC)

[edit] Random orthogonal matrix generation

A comment on using QR to generate a random orthogonal matrix (uniformly distributed)

Unfortunately, the triangular matrix R in the A = QR factorisation does not always have positive diagonal elements (it is not a group factorisation). Try, for example, generating several 2×2 matrices with unit normal elements and plotting the first column of each Q. Only half of the circle will get filled! This can be corrected by creating a diagonal matrix S of the signs of the diagonal of R to get A = QSSR = (QS)(SR) (since SS = I) and using QS as the random orthogonal matrix. This may not be as fast as Stewart's method but is very easy to code in Scilab or Matlab. Paul Earwicker 10:09, 13 April 2007 (UTC)
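The sign correction described above can be sketched in a few lines of NumPy (the function name and the use of numpy.linalg.qr are my own choices for illustration, not from the original comment):

```python
import numpy as np

def random_orthogonal(n, rng=None):
    """Haar-uniform random orthogonal matrix via QR with sign correction."""
    rng = np.random.default_rng(rng)
    A = rng.standard_normal((n, n))   # i.i.d. unit-normal entries
    Q, R = np.linalg.qr(A)            # A = QR, but diag(R) may have negative entries
    S = np.sign(np.diag(R))           # signs of the diagonal of R (zero has probability 0)
    return Q * S                      # QS: scale column j of Q by S[j]
```

The result is orthogonal with determinant ±1; without the multiplication by S, the distribution of Q depends on the QR routine's sign convention and is not Haar-uniform.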

I assume you have not read the Stewart paper. As this article says, he replaces the QR idea by a slightly different approach. A typical way to zero the subdiagonal part of a column in a QR algorithm is to use a Householder reflection,
  I - \frac{2}{\mathbf{x}^T \mathbf{x}} \mathbf{x}\mathbf{x}^T .
Stewart entirely dispenses with the R part, and instead accumulates reflections with a series of vectors, xk,
  \mathbf{x}_k = (0,\ldots,0,\underbrace{\ast,\ldots,\ast}_{k})^T ,
for k = 2, …, n. He ensures that each normalized xk is uniformly distributed on a unit (k−1)-sphere, S^(k−1), by using a standard trick: a normal distribution is both separable and radially symmetric, so it suffices to choose each nonzero component of xk normally distributed.
The mathematical burden in using this efficient method is to know that a Haar distribution is the goal, and to show that this method correctly generates such a distribution. But, again as mentioned in the article, Stewart's method is a special case of the subgroup algorithm, thus the proof is not hard. So, not to worry. --KSmrqT 11:21, 13 April 2007 (UTC)
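A rough Python sketch of accumulating such reflections (an illustrative simplification only; Stewart's paper handles the exact distributional bookkeeping, such as sign conventions, needed for strict Haar uniformity, and the function name is hypothetical):

```python
import numpy as np

def stewart_style_orthogonal(n, rng=None):
    """Accumulate Householder reflections built from Gaussian vectors (sketch)."""
    rng = np.random.default_rng(rng)
    Q = np.eye(n)
    for k in range(2, n + 1):
        x = np.zeros(n)
        # Last k components i.i.d. normal => x/|x| is uniform on the (k-1)-sphere.
        x[n - k:] = rng.standard_normal(k)
        H = np.eye(n) - (2.0 / (x @ x)) * np.outer(x, x)  # Householder reflection
        Q = H @ Q
    return Q
```

Each factor H is an exact orthogonal reflection, so the accumulated product is orthogonal by construction; no R factor is ever formed, which is the efficiency Stewart exploits.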

[edit] "Rectangular matrices" section

That section seems to actually describe things for NON-SQUARE matrices. A square is also a rectangle. And for a square matrix, when the rows are orthogonal, so are the columns? —Preceding unsigned comment added by 87.174.94.96 (talk) 14:50, 23 March 2008 (UTC)
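For the square case the answer is yes, and it is easy to check numerically (a minimal sketch; the matrix here is just an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # Q has orthonormal columns

# For a square Q, Q^T Q = I forces Q Q^T = I as well (Q^T is then a two-sided
# inverse), so the rows are orthonormal exactly when the columns are.
assert np.allclose(Q.T @ Q, np.eye(3))  # columns orthonormal
assert np.allclose(Q @ Q.T, np.eye(3))  # rows orthonormal too
```

The equivalence fails for non-square matrices, which is why a separate "rectangular matrices" discussion makes sense only for the genuinely non-square case.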