Talk:Maximum entropy thermodynamics

From Wikipedia, the free encyclopedia

WikiProject Physics This article is within the scope of WikiProject Physics, which collaborates on articles related to physics.
This article has not yet received a rating on the assessment scale.
This article has not yet received an importance rating within physics.

Please rate this article, and then leave comments here to explain the ratings and/or to identify the strengths and weaknesses of the article.

Articles for deletion This article was nominated for deletion on 28 October 2005. The result of the discussion was keep.

[edit] From stub to article

To do:

  • technical note that, strictly, the entropy should be relative to a prior measure -> Principle of minimum cross-entropy (Kullback-Leibler distance). In thermodynamics we usually assume the "principle of equal a priori probability" over phase space, so the two are then equivalent.
  • section on philosophical implications regarding the conceptual problems of statistical mechanics, second law, etc.
-- (?) DONE Jheald 22:07, 2 November 2005 (UTC)
  • (?) some more algebra, and a simple nonequilibrium example (eg Brownian motion?)

-- Jheald 12:47, 28 October 2005 (UTC)
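The cross-entropy point in the first to-do item can be checked numerically. The snippet below is an illustrative sketch (example numbers are made up, not taken from the article): with a uniform prior, minimizing the Kullback-Leibler distance D(p||m) is the same as maximizing the Shannon entropy H(p), since D(p||m) = log(n) - H(p).

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, m):
    """Kullback-Leibler distance D(p||m) in nats."""
    return sum(pi * math.log(pi / mi) for pi, mi in zip(p, m) if pi > 0)

n = 4
p = [0.1, 0.2, 0.3, 0.4]        # an arbitrary example distribution
uniform = [1.0 / n] * n          # "equal a priori probability" prior

# With a uniform prior, D(p||m) = log(n) - H(p), so minimizing the
# cross-entropy picks out the same distribution as maximizing the entropy.
assert abs(kl_divergence(p, uniform) - (math.log(n) - shannon_entropy(p))) < 1e-12
```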

[edit] Introduction could be more friendly

(from the Article deletion page):

Note to author - please consider adding a few paragraphs up front in layman talk before getting on to the partial differentials. There ought to be something you can say about maximum entropy that I can slip into a casual conversation. Denni 23:56, 28 October 2005 (UTC)

[edit] Average entropy, measured entropy and entropy fluctuations

At the moment the article isn't very clear about when it's talking about expectation values (either as constraints or as predictions) and when it's talking about actual measurements. For example, in the discussion of the second law, the measured macroscopic quantities probably won't land exactly on the predicted values -- instead (we assume) they will fall within the predicted margin of uncertainty.

This especially needs to be cleaned up in the context of entropy, particularly if we're going to discuss the fluctuation theorem.

Also, the new measurements will therefore contain (a little) new information over and above the predicted distribution. So it's not quite true that SI is unchanged: it will still be a constant, but strictly speaking it will become a different constant, as we propagate the new information back, sharpening up our phase-space distribution for each instant back in time.

-- Jheald 15:51, 1 November 2005 (UTC)

-- DONE Jheald 22:07, 2 November 2005 (UTC)