Talk:Determinant

From Wikipedia, the free encyclopedia

Formula

It would be useful to add this formula

| A + BCD | = | A | | C | | C⁻¹ + DA⁻¹B |

which I always think of as the equivalent of the Sherman–Morrison formula for determinants, i.e. it allows you to update a determinant without recomputation. However, I'm not sure of its origins (I just have it written on a scrap of paper) or how general it is, so I'd rather someone with more knowledge put it up. --Lawrennd 14:03, 30 September 2005 (UTC)
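For what it's worth, this appears to be the generalized form of the matrix determinant lemma, and it's easy to sanity-check numerically. A sketch assuming A (n×n) and C (k×k) are invertible; the matrix sizes and random seed here are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 4, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)   # shifted to be safely invertible
C = rng.standard_normal((k, k)) + k * np.eye(k)
B = rng.standard_normal((n, k))
D = rng.standard_normal((k, n))

# |A + BCD|  vs  |A| |C| |C^-1 + D A^-1 B|
lhs = np.linalg.det(A + B @ C @ D)
rhs = np.linalg.det(A) * np.linalg.det(C) * np.linalg.det(
    np.linalg.inv(C) + D @ np.linalg.inv(A) @ B)

assert np.isclose(lhs, rhs)
print("identity holds:", np.isclose(lhs, rhs))
```

Note that the low-rank update only requires the determinant of a k×k matrix once |A| and A⁻¹ are known, which is where the "update without recomputation" saving comes from.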

Requests

"The interpretation is that this gives the area of the parallelogram with vertices at (0,0), (a,c), (b,d), and (a + b, c + d), with a sign factor (which is −1 if A as a transformation matrix flips the unit square over)."

   Could someone draw up an example please? - Cyberman
You could draw it, upload it and add it... --Carbonrodney

Just added that picture. It's done:) Feel Free to edit mercilessly, perhaps with an animated gif. Rpchase 21:50, 7 November 2006 (UTC)

Maybe someone could add something about the computational cost of finding a determinant. I've heard of algorithms that are O(n^d) with d < 3 (d = 3 for Gaussian elimination), and specifically I've heard a rumour that someone proved you can get d asymptotically as close to 2 as you want. Anyone know if this is true? -Stephen

It should be possible to code it in linear time. I will write one up after exams, maybe. --Carbonrodney
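For contrast with the O(n^3) elimination cost Stephen mentions, the textbook cofactor (Laplace) expansion runs in O(n!) time, which a small sketch makes concrete (illustrative code, not from the article):

```python
def det_laplace(m):
    """Determinant by cofactor (Laplace) expansion along the first row.

    Runs in O(n!) time, so it is only practical for tiny matrices;
    Gaussian elimination achieves the same result in O(n^3).
    """
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then recurse.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det_laplace(minor)
    return total

print(det_laplace([[1, 2], [3, 4]]))  # -2
```

The sub-cubic algorithms alluded to above are based on fast matrix multiplication, so the achievable exponent d tracks the matrix-multiplication exponent.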


Somehow this article seems a little disorganized; maybe somebody has some idea to make it more structured. I think determinants are connected to many notions in linear algebra (invertibility, number of solutions, etc.), and to notions in other fields of mathematics (e.g. the Wronskian). Maybe somebody who is an expert can add some more of these properties, or appropriate links. Also it seems there is a lot of determinant magic out there. Anton (28.02.07)

Misc

I find the entire page too in-depth. This is not necessarily a bad thing for an encyclopedia, but we should consider users who are not so math-savvy, or users with just intermediate math who came in simply to find out how to find the determinant of a 3x3 matrix.

Finding a 3x3 determinant, non-math savvy description:

  1. multiply the numbers on the diagonals that go from left to right (imagine the two shorter diagonals wrapping around the matrix) and sum the products.
    S1 = (a1 · b2 · c3) + (a2 · b3 · c1) + (a3 · b1 · c2)
  2. multiply the numbers on the diagonals that go from right to left and sum the products.
    S2 = (a3 · b2 · c1) + (a2 · b1 · c3) + (a1 · b3 · c2)
  3. the determinant is the first sum minus the second.
    det = S1 − S2
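The three-step recipe above is the rule of Sarrus (it works for 3x3 matrices only). A minimal sketch in Python, with rows a, b, c named as in the description (the function name is mine):

```python
def det3(a, b, c):
    """Rule of Sarrus: determinant of the 3x3 matrix with rows a, b, c."""
    a1, a2, a3 = a
    b1, b2, b3 = b
    c1, c2, c3 = c
    s1 = a1*b2*c3 + a2*b3*c1 + a3*b1*c2   # left-to-right diagonals
    s2 = a3*b2*c1 + a2*b1*c3 + a1*b3*c2   # right-to-left diagonals
    return s1 - s2

print(det3((2, 0, 0), (0, 3, 0), (0, 0, 4)))  # 24
```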

Also a picture or two wouldn't hurt. Something like http://mathworld.wolfram.com/Determinant.html is a lot more pleasant looking than pure text.

Adding an easy-to-understand section like my description above would expand the use of this Wikipedia page, making it a math reference instead of just research material.


The last phrase on this page seems pretty suspect. What exactly is a "Linear Algebraist"? A specialist in multilinear algebra?

On the other hand, I have never met an algebraist who "preferred" the Leibniz formula. I suppose it might be useful for computation in certain situations, but I can't imagine one claiming that you should just forget everything else and remember only that.

Somebody (myself, if I manage to overcome my laziness) should add something about the formal definition of the determinant (an alternating function of the rows or columns, etc.), of which its uniqueness and the ways to compute it are consequences. --Goochelaar

...and add to that the foundation of the definition, which is something to do with multilinear functions.
Also worth mentioning that historically, the concept of determinant came before the matrix.

That would certainly be very interesting. What is the history of the concept? --AxelBoldt

I'll see what I can dig up, but briefly: a determinant was originally a property of a system of equations. When the idea of putting co-efficients into a grid came up, the term "matrix" was coined to mean "mother of the determinant", as in womb.
The determinant function is defined in terms of vector spaces. It is the only function f : F^n × F^n × ... × F^n → F that is multilinear and alternating such that f(standard basis) = 1.
Obviously, the above needs a major amount of fleshing out....


I rewrote the page in a format similar to trace of a matrix. Wshun


Text moved over from Talk:Determinant mathematics

Perhaps mention of the Scalar Triple Product, a.k.a. the Box Product, is fitting in the paragraph about the volume of the parallelopiped. If only to introduce the nomenclature.

I'm not familiar with that. Is it just the determinant of three 3-vectors? --AxelBoldt

Essentially, yes. According to Advanced Engineering Mathematics by Erwin Kreyszig: "The scalar triple product or mixed triple product of three vectors

  a = [a1, a2, a3],   b = [b1, b2, b3], c = [c1, c2, c3]

is denoted by (a b c) and is defined by

(a b c) = a · (b × c)."

Since the cross product can be defined as a determinant whose first row consists of unit vectors, it is easy to prove that the scalar triple product is the determinant of a matrix whose rows are the three vectors. Take its absolute value, and you get a volume. Another use of the product, besides computing volumes, is to show that three 3-d vectors are linearly independent ((a b c) ≠ 0 ⇒ a, b, c are linearly independent). From what I understand, it's a dying notation because it can be expressed in terms of the dot and cross products, but it still has a couple of uses.

Perhaps just include mention of it on this page, and define it on a vector calc page.
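As a quick illustration of the definition quoted from Kreyszig above, all three uses (determinant identity, volume, independence test) can be checked with numpy; the example vectors here are mine:

```python
import numpy as np

def triple(a, b, c):
    """Scalar triple product (a b c) = a . (b x c)."""
    return np.dot(a, np.cross(b, c))

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
c = np.array([0.0, 0.0, 2.0])

# (a b c) equals the determinant of the matrix whose rows are a, b, c ...
assert np.isclose(triple(a, b, c), np.linalg.det(np.array([a, b, c])))
# ... its absolute value is the volume of the parallelepiped they span ...
print(abs(triple(a, b, c)))  # 2.0
# ... and a nonzero value certifies linear independence.
assert triple(a, b, c) != 0
```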


Hmmm - talk about determinants with vector entries - that really ducks what's going on, no? Which is a 2-vector (wedge of vectors) being paired with a vector. Charles Matthews

I moved this out of the page.

Here is a 2-by-3 matrix (used when taking the cross product of two vectors)

B=\begin{bmatrix}a&b&c\\d&e&f\end{bmatrix}

which has the determinant (in vector form)

det(B) = [bf − ce, cd − af, ae − bd].

This is a bit off-topic, and confusing on a page about square matrices. It really belongs with (perhaps) cross product, or introductory exterior algebra.

Charles Matthews 18:45, 21 May 2004 (UTC)

I always knew that procedure with the three basis vectors in the first row, for a, b ∈ R³, i.e.
B=\begin{pmatrix} \mathbf{e}_1 & \mathbf{e}_2 & \mathbf{e}_3 \\ a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ \end{pmatrix}
which keeps the matrix square, and keeps the notation consistent too Dysprosia 00:31, 22 May 2004 (UTC)

But what does a determinant with vectors in it mean? This is a good mnemonic, though. Charles Matthews 08:21, 22 May 2004 (UTC)

I'm not sure that it has any other special (tensor-ish?) meaning, other than if one writes that determinant in terms of the Levi-Civita symbol having those basis vectors there help organize the components. But yes, it is a good mnemonic :) Dysprosia 00:15, 23 May 2004 (UTC)
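For what it's worth, the Levi-Civita bookkeeping Dysprosia mentions is easy to spell out in code. This sketch (names are mine) recovers the cross product from ε_{ijk} and checks it against numpy:

```python
import numpy as np

def levi_civita(i, j, k):
    """epsilon_ijk: +1 / -1 for even / odd permutations of (0, 1, 2), else 0."""
    if {i, j, k} != {0, 1, 2}:
        return 0
    # For a permutation of (0, 1, 2) this product is +/-2, so halving gives the sign.
    return (j - i) * (k - i) * (k - j) // 2

def cross_lc(a, b):
    """Component-wise cross product: (a x b)_i = sum_jk eps_ijk a_j b_k."""
    return np.array([sum(levi_civita(i, j, k) * a[j] * b[k]
                         for j in range(3) for k in range(3))
                     for i in range(3)])

a, b = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
print(np.allclose(cross_lc(a, b), np.cross(a, b)))  # True
```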

I think this actually belongs at minor (linear algebra), as a concrete example to balance the general stuff. And this article should link there, in relation to taking determinants when the matrix is not square.

Charles Matthews 08:34, 23 May 2004 (UTC)

Alternate definition?

I think we could add the fact that the determinant is defined as it is because it is the only function

F\colon \{n \times n \text{ matrices}\} \longrightarrow \mathbb{K}

with the properties:

  • it is linear w.r.t. columns;
  • whenever any two columns are exchanged, it changes its sign;
  • F(Id) = 1.

If nobody disagrees, I will add this in a couple of days. Cthulhu.mythos 15:26, 29 May 2006 (UTC)

I think that is a fine idea. This "definition" requires a theorem (such a map is unique) before it is well-defined, but it is of course much more explicit about the useful properties the determinant should have. In fact, your description is nothing more than an expression of the "best" definition in terms of columns: the determinant is the induced map of some linear transformation on the top exterior power. Exterior powers are defined in terms of a universal property of alternating maps, and the definition you cite is none other than that. Anyway, it's a good idea. Don't wait a few days, go ahead and do it now. -lethe talk + 16:53, 29 May 2006 (UTC)
Done. As soon as I manage to reconstruct the proof, I will add it too. Cthulhu.mythos 16:25, 30 May 2006 (UTC)
Including a proof here would make the page even more overlong. I put it in Leibniz formula (determinant). Cthulhu.mythos 08:55, 31 May 2006 (UTC)
Thanks, the addition looks good. I don't think a proof is too important. What we need now is an explanation and example of how to calculate a determinant by row reduction, a calculation based on the alternating-ness. It's shameful that there's only a brief passing mention of this algorithm, since that's how it's actually done in practice. Only a dummy uses expansion by minors. -lethe talk + 10:57, 31 May 2006 (UTC)
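Row reduction uses exactly two facts from the axiomatic definition: each row swap flips the sign (the alternating property), and adding a multiple of one row to another leaves the determinant unchanged, so the determinant ends up being the signed product of the pivots. A minimal pure-Python sketch (the function name is mine, not from the article):

```python
def det_by_row_reduction(m):
    """Determinant via Gaussian elimination with partial pivoting, O(n^3).

    Each row swap multiplies the determinant by -1; row additions leave it
    unchanged, so the result is the signed product of the pivots.
    """
    a = [row[:] for row in m]          # work on a copy
    n = len(a)
    sign = 1.0
    for col in range(n):
        # Partial pivoting: pick the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        if abs(a[pivot][col]) < 1e-12:
            return 0.0                 # (numerically) singular matrix
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            sign = -sign               # the alternating property
        for r in range(col + 1, n):
            factor = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= factor * a[col][c]
    prod = sign
    for i in range(n):
        prod *= a[i][i]                # product of pivots
    return prod

print(det_by_row_reduction([[1, 2], [3, 4]]))  # -2.0
```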

Clarification

\det(\exp(A)) = \exp(\operatorname{tr}(A)).

Doesn't this only hold when A is diagonalizable? No such limitation is mentioned. Can anyone prove this? (132.163.135.113 00:31, 12 December 2006 (UTC))

No, it always holds. This is exercise 6.2.4 in Horn and Johnson, Topics in Matrix Analysis; I'm sure it's also in many other books. The proof is via the Jordan normal form, I'd guess. Alternatively, take a look at the proof at PlanetMath. -- Jitse Niesen (talk) 06:49, 12 December 2006 (UTC)
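To address the diagonalizability worry directly: the identity also holds for a nilpotent Jordan block, which is as non-diagonalizable as it gets. A quick numerical check using scipy.linalg.expm (the test matrices are my own choices):

```python
import numpy as np
from scipy.linalg import expm

# A generic random matrix (almost surely not normal).
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
assert np.isclose(np.linalg.det(expm(A)), np.exp(np.trace(A)))

# A nilpotent Jordan block: not diagonalizable, yet the identity holds.
# expm(J) = [[1, 1], [0, 1]], so both sides equal 1.
J = np.array([[0.0, 1.0], [0.0, 0.0]])
print(np.isclose(np.linalg.det(expm(J)), np.exp(np.trace(J))))  # True
```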

Mistake in image

I think there is a mistake in the current version of Determinant.jpg. Please have a look at Image talk:Determinant.jpg. --Abdull 16:50, 1 January 2007 (UTC)

The same point was made by a remark left by an IP editor, who said:

"The drawing is wrong, exchange a and d for b and c. I mean, it is right, but if you don't exchange the letters, the area is negative, which is not wrong, but confusing."

-- Jitse Niesen (talk) 01:06, 23 March 2007 (UTC)