Talk:Jacobian

From Wikipedia, the free encyclopedia

WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: Start Class Mid Priority  Field: Analysis
One of the 500 most frequently viewed mathematics articles.


Differentiable versus partial derivatives

I have serious doubts as to the validity of this entire section. From "Calculus on Manifolds" by Spivak, a function is defined to be differentiable at a point if it has a linear approximation there, and further this linear approximation is unique.

Further, the example used does not give the correct definition of the directional derivative, which is \lim_{h\to 0}(f(a+hu)-f(a))/h for a unit vector u. And though the definition given will work, the answer derived is incorrect, because there is confusion about which coordinate system, (x,y) or (r,\theta), is being used here.

The function is defined in (r,theta) space, but the limit is being taken in (x,y) space, tending towards (0,0) along the (1,1) line. This corresponds to traversing the function in (r,theta) space along a line tending towards (0,pi/4), not towards (0,0) as requested.

In (x,y) space, the function is defined as xy/sqrt(x^2+y^2), and so neither it nor its Jacobian is defined at (x,y)=(0,0). The function is defined at (r,theta)=(0,0), and so is its Jacobian, which equals (0,0). A Taylor series will show this very easily.
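The claim above can be checked numerically. The sketch below (my own illustration, using simple finite differences) takes f(x,y) = xy/sqrt(x^2+y^2), extended by f(0,0) = 0, and shows that both partial derivatives at the origin are 0 while the directional derivative along the 45° unit vector is 1/2, so no single linear approximation matches all directions there:

```python
import math

def f(x, y):
    # xy / sqrt(x^2 + y^2), extended continuously by f(0,0) = 0
    return 0.0 if x == 0 and y == 0 else x * y / math.hypot(x, y)

h = 1e-8
fx = (f(h, 0) - f(0, 0)) / h                 # partial wrt x at the origin: 0
fy = (f(0, h) - f(0, 0)) / h                 # partial wrt y at the origin: 0
u = (1 / math.sqrt(2), 1 / math.sqrt(2))     # unit vector at 45 degrees
du = (f(h * u[0], h * u[1]) - f(0, 0)) / h   # directional derivative: 1/2
```

If f were differentiable at the origin, the directional derivative would have to equal the gradient dotted with u, i.e. 0; instead it is 1/2, so the partials exist but f is not differentiable there.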

For the above reasons, I am removing this section. ObsessiveMathsFreak (talk) 14:36, 11 February 2008 (UTC)

Notations, Determinants

Hello. About notation, I seem to recall the notation DF for the Jacobian of F. Have others seen that notation? -- On a different note, maybe we can mention that if F is a scalar field then the gradient of F is its Jacobian. Happy editing, Wile E. Heresiarch 21:28, 23 Mar 2004 (UTC)


Isn't there an error in the simplification of the determinant given as an example? Where has the term x3 cos(x1) gone? -- looxix 00:32 Mar 24, 2003 (UTC)

I've always heard the initial consonant pronounced as an affricate, as in "John". Michael Hardy 01:23 Mar 24, 2003 (UTC)

So have I, and it's listed as the first of two possible pronunciations at Merriam Webster. I've added it to the article. Pmdboi 03:36, 5 March 2006 (UTC)
Since Jacobi is of German origin, I guess his name and its derivatives should be pronounced only in the second way. English pronunciation rules cannot be applied to words of foreign origin.

I originally put in the matrix here, and put in most of the structure. I did make a mistake in terminology, though, as I see has been corrected. I defined the Jacobian matrix, where the "Jacobian" per say refers to the determinant of that matrix. My point is that this page was originally designed to define the Jacobian matrix, and I see that that definition is a stub. I have a copy of the page before it was fixed. I'm posting it in the stub for Jacobian matrix. I think, then, it would be a good idea to discuss whether we might want to combine the two into one page. I'm for this. I think the ideas need to be presented closely together for fluent comprehension, and a brief and clear page describing first the Jacobian matrix and then the Jacobian would be simple to construct, as well as being a better way to present the topic. Kevin Baas 2003.03.26

Why in the world do you call

(x_1,\dots,x_n)

a "basis" of an n-space? A basis would be something like this:

\{\,(1,0,0,0,\dots,0),\,(0,1,0,0,\dots,0),\dots,\,(0,0,0,0,\dots,0,1)\,\}.

(By the way, the Latin phrase "per se" doesn't have an "a" or a "y" in it.) Michael Hardy 20:36 Mar 26, 2003 (UTC)

---

From my understanding of basis, it is the selection of a set of measurements from which one defines a coordinate system to describe a space. Thus, the unit vectors:

\{\,(1,0,0,0,\dots,0),\,(0,1,0,0,\dots,0),\dots,\,(0,0,0,0,\dots,0,1)\,\}.

would be defined by way of the basis. That is, one could pick an entirely different system of measurements, with entirely unrelated units of measurement, and have a different 'basis' from which to define 'distances' in the space, which would be equally valid, although (1,0,0) in one system would not be the same as (1,0,0) in another system.

Thus, when it is said that (x_1,\dots,x_n) is a basis, I interpret this as saying that x1, etc. is the system of normalized variables used to measure the space. One could have just as easily (and may find it useful for other purposes) defined a topologically equivalent space with a different 'basis', orthogonal to this one.

However, this is merely a very fuzzy intuitive interpretation, and I'm not justifying the use. I am explaining what i think was the intention. -Kevin Baas

Let me further add that I think, though my memory is shaky here, that a basis is a set of vectors. That is, they can only be something like: {(4,3,0), (5,0,4), (1,3,2)} such that {(4x,3x,0x), (5y,0y,4y), (1z,3z,2z)} are linearly independent. Thus, they depend on a pre-established system of variables, and are based off of the eigenvalues of that system. -Kevin Baas


Would it be correct to say f is conformal iff J_f(x)\,|J_f(x)|^{-1/n} is orthogonal (where n is the dimension)? 142.177.126.230 23:49, 4 Aug 2004 (UTC)
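For the standard conformal map z -> z^2 (away from the origin), this can be checked numerically; the following sketch is only an illustration, with the evaluation point chosen arbitrarily:

```python
# Jacobian of the conformal map (x, y) -> (x^2 - y^2, 2xy), i.e. z -> z^2
def jac(x, y):
    return [[2 * x, -2 * y], [2 * y, 2 * x]]

x, y = 1.3, -0.7                      # arbitrary point away from the origin
J = jac(x, y)
det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
s = abs(det) ** (-1 / 2)              # |det J|^(-1/n) with n = 2
Q = [[s * a for a in row] for row in J]
# Q^T Q should be the identity matrix if Q is orthogonal
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
```

Here Q^T Q comes out as the identity, consistent with the proposed characterization for this particular map; this is of course a spot check, not a proof.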


I'm confused


\left(\frac{\partial f}{\partial y}\right)_g=\frac{\partial{(f,g)}}{\partial{(y,g)}}

When I write out the Jacobian and compute its determinant, I get 0, zero, null, nothing!?! Am I right?
Thanks
| mail me

Tensor Product?

Isn't the Jacobian matrix just the tensor product of f with the grad operator? That is, is this correct?:

J_{ij}(\mathbf{f}(\mathbf{x})) = \mathbf{J}(\mathbf{f}(\mathbf{x})) = \mathbf{f}(\mathbf{x})\otimes\nabla = \frac{\partial f_i(\mathbf{x})}{\partial x_j}
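As a sanity check on the index ordering J_ij = df_i/dx_j (rows indexed by the components of f, columns by the components of x), here is a small finite-difference sketch; the map below is a made-up example chosen so the analytic partials are easy to read off:

```python
import math

# illustrative map R^3 -> R^2
def f(x1, x2, x3):
    return (x1 * x3 + x2, math.exp(x2) * x3)

def partial(g, args, j, h=1e-6):
    # central finite difference in the j-th argument of g
    a_plus = list(args); a_plus[j] += h
    a_minus = list(args); a_minus[j] -= h
    return (g(*a_plus) - g(*a_minus)) / (2 * h)

p = (2.0, 0.0, 1.5)
# J_ij = d f_i / d x_j: i runs over outputs (rows), j over inputs (columns)
J = [[partial(lambda *a: f(*a)[i], p, j) for j in range(3)] for i in range(2)]
# analytically J = [[x3, 1, x1], [0, e^{x2} x3, e^{x2}]] = [[1.5, 1, 2], [0, 1.5, 1]]
```

The numerical matrix matches the analytic partials arranged with i as the row index, which is exactly the arrangement the outer-product notation above describes.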

—Ben FrantzDale 16:57, 5 May 2006 (UTC)

Orientation

Furthermore, if the Jacobian determinant at p is positive, then F preserves orientation near p;

Orientation as in Orientability or Orientation (mathematics)? --Abdull 12:56, 25 May 2006 (UTC)

As in both of those; they're the same concept. —Keenan Pepper 18:45, 25 May 2006 (UTC)

Jacobians and gradients: consistent definitions please!

In this wiki the gradient of a scalar function f(x) wrt a vector x is defined as a column vector:


\partial f/\partial x=\begin{bmatrix}\partial f/\partial x_1& \cdots &\partial f/\partial x_n\end{bmatrix}^\top.

The Jacobian matrix is defined as in this page I'm commenting now:


\partial f/\partial x=\begin{bmatrix}\partial f_1/\partial x_1 &\cdots& \partial f_1/\partial x_n\\
\vdots & \ddots & \vdots\\
\partial f_m/\partial x_1 &\cdots &\partial f_m/\partial x_n\end{bmatrix}

Later in the same Jacobian matrix entry it is said that the rows of the Jacobian matrix are the gradients of each component of f wrt x. This is obviously impossible!

I propose to give a double definition, with a clear notation describing the adopted convention, as follows:

a)  \partial f^\top/\partial x=\begin{bmatrix}\partial f_1/\partial x_1 &\cdots& \partial f_m/\partial x_1\\
\vdots & \ddots & \vdots\\
\partial f_1/\partial x_n &\cdots &\partial f_m/\partial x_n\end{bmatrix}

b) \partial f/\partial x^\top=\begin{bmatrix}\partial f_1/\partial x_1 &\cdots& \partial f_1/\partial x_n\\
\vdots & \ddots & \vdots\\
\partial f_m/\partial x_1 &\cdots& \partial f_m/\partial x_n\end{bmatrix}

and similarly for the gradient as df/dx and df/dx'. This way everything fits. Note that the orientation of the variable wrt which we differentiate drives the orientation of the output gradient vectors. For the Jacobians, both the differentiated vector function f and the differentiator vector x drive the distribution of the output matrix elements. This way, we could also define the two following (quite useless I admit) constructions:

c) \partial f/\partial x = a column-stacked vector of all column gradients of the components of f.

d) \partial f^\top/\partial x^\top = a row-stacked vector of all row gradients of the components of f.

Personally I find form b) much more useful, since when we pass to the partial derivatives of matrix expressions we can almost mimic the scalar differentiation rules. Using a) in this context leads to incredibly confusing expressions, full of transpose marks.
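The relationship between the two conventions can be illustrated with a small finite-difference sketch (the function below is a made-up example, not anything from the article): form b) has the components of f indexing the rows, and form a) is simply its transpose.

```python
import math

def f(x):
    # illustrative map R^2 -> R^3
    x1, x2 = x
    return [x1 * x2, math.sin(x1), x2 ** 2]

def jac_b(f, x, h=1e-6):
    # form b): rows indexed by components of f, columns by components of x
    fx = f(x)
    J = [[0.0] * len(x) for _ in fx]
    for j in range(len(x)):
        xp = list(x)
        xp[j] += h
        fp = f(xp)
        for i in range(len(fx)):
            J[i][j] = (fp[i] - fx[i]) / h
    return J

Jb = jac_b(f, [1.0, 2.0])            # 3 x 2: row i is the (row) gradient of f_i
Ja = [list(row) for row in zip(*Jb)]  # form a) is the transpose: 2 x 3
```

Row i of Jb is the gradient of f_i written as a row, which is why the article has to say "transpose of the gradient" if the gradient itself is declared to be a column vector.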

Joan Sola LAAS-CNRS Toulouse France 82.216.60.51 11:37, 17 August 2006 (UTC)

What is "obviously impossible"? The first row of the Jacobian matrix is (df1/dx1 ... df1/dxn), which is the gradient of the first component of the function f. The Jacobian matrix, as defined on this page, is form b) in your list, which seems to be precisely what you want. -- Jitse Niesen (talk) 11:07, 17 August 2006 (UTC)
Excuse me, I submitted with some errors. Reread what I posted now and you will see what I mean. Thanks 82.216.60.51 11:39, 17 August 2006 (UTC)
I'd like to insist on this topic. I corrected the first line of my comment, which originally said gradients were defined as row vectors, thus making my comment absurd. The gradient is defined as a column vector in Wikipedia. So either we leave the vector and matrix orientations out of the definition (thus giving this freedom to the user), or we use consistent definitions. I'm for the second alternative, but I am not a mathematician so I can't take a strong position on this. --Note: I've just made up an account so I sign now as Joan Solà 15:20, 21 August 2006 (UTC) though I'm the same who started this topic. You can email me if you wish. Cheers. -- Joan Solà 15:20, 21 August 2006 (UTC)
Indeed, the Wikipedia article gradient defines it as a column vector. That's the most common definition, I think, though it often makes more sense to define it as a row vector. Anyway, I changed the text to remove the contradiction; it now says that each row of the Jacobian matrix is the transpose of the gradient.
Your proposal with  \partial f^\top/\partial x etc. is cute, but it is not used as far as I know. According to our no-original-research policy, we cannot make up new definitions but have to stick with those already in use. Thanks for your comments. -- Jitse Niesen (talk) 07:58, 22 August 2006 (UTC)
All right, I agree with the solution. Thanks. Joan Solà 12:56, 23 August 2006 (UTC)

Vanishing Jacobians

I was looking for information about vanishing Jacobians on Google, and I found this article. It says nothing about vanishing Jacobians. Maybe somebody should add something about them?

James 03:25, 25 June 2007 (UTC)

In dynamical systems, Stationary point

Doesn't the Hessian need to be checked to see whether it's a stationary point vs. an extremely unstable point? (i.e., a maximum). Ashi Starshade 18:17, 5 July 2007 (UTC)

There are two closely connected meanings of "stationary point". The meaning intended in the article is that x is a stationary point for the dynamical system x' = f(x) if f(x) = 0. Another meaning is that x is a stationary point for a function f if the derivative of f is zero; this is the meaning that the article stationary point mentions. I don't quite understand your comment, but it seems that you're confusing these two meanings. -- Jitse Niesen (talk) 20:50, 6 July 2007 (UTC)

Differentiable versus partial derivatives

Excuse me, but in this section it says

for instance, in the (1,1) direction (45°) this equals \sin 45^\circ=\sqrt{2}/2.

Should it not say something like

\sin (2(45^\circ))=1

since

f(r,θ) = r sin(2θ)

? Dimbulb15 (talk) 23:31, 3 February 2008 (UTC)

Is It Possible?

First off, I will admit that I only got C's and occasionally B's in multivariable calculus and vector calculus classes, so maybe I simply don't know what I'm talking about. But early in the article it states that the Jacobian matrix can exist even if the function is not differentiable at a point; that partial (and not total) derivatives are sufficient for its existence. I thought that for a function to be differentiable it also had to be continuous at the point in question, i.e. that all derivatives (partial and total) would exist. I suppose, for example, that if Z = f(X,Y) and Y is constant, then there would only be dZ/dX, and dZ/dY would be zero, and therefore the total derivative would not exist at the point. Could someone please clarify this for me? Thanks. JeepAssembler (talk) 21:55, 2 March 2008 (UTC)

Let H be the Heaviside function. Then let f(x,y) = H(xy). The partial derivatives exist at (0,0), but the function is not differentiable there. ObsessiveMathsFreak (talk) 11:00, 22 April 2008 (UTC)
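This counterexample can be verified directly; the sketch below (my own illustration, using the convention H(0) = 0) shows both partials at the origin exist and equal 0, yet f jumps to 1 arbitrarily close to the origin along the diagonal, so it is not even continuous there, let alone differentiable:

```python
def H(t):
    # Heaviside step function, with the convention H(0) = 0
    return 1.0 if t > 0 else 0.0

def f(x, y):
    return H(x * y)

h = 1e-8
fx = (f(h, 0) - f(0, 0)) / h   # = 0: f vanishes all along the x-axis
fy = (f(0, h) - f(0, 0)) / h   # = 0: f vanishes all along the y-axis
diag = f(h, h)                  # = 1 arbitrarily close to the origin
```

So the Jacobian (the pair of partials) exists at (0,0) even though f is discontinuous there, exactly as claimed in the article.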