User:Tobias Bergemann/Scratch
History of Scheme
Older standards
- RRS ("The Revised Report on Scheme", G.L. Steele et al, AI Memo 452, MIT, Jan 1978)
- R2RS ("The Revised Revised Report on the Algorithmic Language Scheme", Clinger, AI Memo 848, MIT Aug 1985)
- R3RS ("Dedicated to the Memory of ALGOL 60", Revised(3) Report on the Algorithmic Language Scheme)
- R4RS (Revised(4) Report on the Algorithmic Language Scheme)
R5RS and R6RS are already referenced from Scheme (programming language).
History of call/cc
- interaction with dynamic-wind (R5RS call/cc & dynamic-wind in terms of r4rs, faking dynamic-wind, dynamic-wind, A new specification for dynamic-wind, call/wc, and call/nwc, implementing dynamic-wind); a short sketch of this interaction follows the list
- interaction with values and call-with-values
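A minimal sketch of these interactions, assuming an R5RS-style Scheme; the names wind-trace and note are made up for illustration and are not taken from the threads listed above. Invoking a continuation captured inside a dynamic-wind re-runs the before thunk on every entry into the extent and the after thunk on every exit:

  (define (wind-trace)
    (let ((trace '())
          (k #f))
      (define (note x) (set! trace (cons x trace)))
      (dynamic-wind
        (lambda () (note 'before))               ; runs on every entry into the extent
        (lambda ()
          (call-with-current-continuation
            (lambda (c) (set! k c)))             ; capture a continuation inside the extent
          (note 'during))
        (lambda () (note 'after)))               ; runs on every exit from the extent
      (if (< (length trace) 6) (k #f))           ; jump back into the extent exactly once
      (reverse trace)))

  (wind-trace)
  ;; => (before during after before during after)

  ;; With values/call-with-values: a continuation captured in the producer
  ;; can deliver multiple values straight to the consumer.
  (call-with-values
    (lambda () (call-with-current-continuation (lambda (k) (k 1 2))))
    +)
  ;; => 3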
Shannon entropy: characterization
Information entropy is characterised by these desiderata:
(Define p_i = \Pr(X = x_i) and H_n(p_1, \ldots, p_n) = H(X).)
The measure should be continuous and symmetric, i.e. changing the value of one of the probabilities by a very small amount should only change the entropy by a small amount, and the measure should be unchanged if the outcomes x_i are re-ordered:
H_n(p_1, p_2, \ldots, p_n) = H_n(p_2, p_1, \ldots, p_n), etc.
The measure should be maximal for uniformly distributed events (uncertainty is highest when all possible events are equiprobable):
H_n(p_1, \ldots, p_n) \le H_n\left(\frac{1}{n}, \ldots, \frac{1}{n}\right)
For equiprobable events the entropy should increase with the number of outcomes:
H_n\left(\frac{1}{n}, \ldots, \frac{1}{n}\right) < H_{n+1}\left(\frac{1}{n+1}, \ldots, \frac{1}{n+1}\right)
The amount of entropy should be independent of how the process is regarded as being divided into parts.
This last functional relationship characterizes the entropy of a system with sub-systems. It demands that the entropy of a system can be calculated from the entropy of its sub-systems if we know how the sub-systems interact with each other.
Given an ensemble of n uniformly distributed elements which are arbitrarily divided into k boxes (sub-systems) with b_1, \ldots, b_k elements respectively, the entropy of the whole ensemble should be equal to the sum of the entropy of the system of boxes and the individual entropies of the boxes, each weighted with the probability of finding oneself in that particular box.
For positive integers b_i where b_1 + b_2 + \cdots + b_k = n,
H_n\left(\frac{1}{n}, \ldots, \frac{1}{n}\right) = H_k\left(\frac{b_1}{n}, \ldots, \frac{b_k}{n}\right) + \sum_{i=1}^{k} \frac{b_i}{n} H_{b_i}\left(\frac{1}{b_i}, \ldots, \frac{1}{b_i}\right)
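As a quick numeric check of this relation (a worked instance, not part of the characterization itself), take n = 6 elements split into k = 2 boxes with b_1 = 2 and b_2 = 4, and evaluate with the base-2 Shannon form that satisfies these axioms:
H_6\left(\frac{1}{6}, \ldots, \frac{1}{6}\right) = \log_2 6 \approx 2.585
H_2\left(\frac{2}{6}, \frac{4}{6}\right) \approx 0.918, \qquad \frac{2}{6} H_2\left(\frac{1}{2}, \frac{1}{2}\right) + \frac{4}{6} H_4\left(\frac{1}{4}, \ldots, \frac{1}{4}\right) = \frac{1}{3}\cdot 1 + \frac{2}{3}\cdot 2 \approx 1.667
and indeed 0.918 + 1.667 \approx 2.585, as the relation requires.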
Choosing the sub-ensembles to be of equal size, i.e. with b_i = n/k, the additivity assumption implies that the total entropy is given by the sum of the uncertainty of choosing one of the k boxes and the uncertainty of choosing a specific event within the chosen box:
H_n\left(\frac{1}{n}, \ldots, \frac{1}{n}\right) = H_k\left(\frac{1}{k}, \ldots, \frac{1}{k}\right) + H_{n/k}\left(\frac{k}{n}, \ldots, \frac{k}{n}\right)
Choosing k = n, so that every box contains exactly one element, this implies that the entropy of a certain outcome is zero:
H_1(1) = 0
It can be shown that any definition of entropy satisfying these assumptions has the form
H_n(p_1, \ldots, p_n) = -K \sum_{i=1}^{n} p_i \log p_i
where K is a constant corresponding to a choice of measurement units.
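As a direct consequence of this formula (with K and the logarithm base together fixing the units, e.g. K = 1 with \log_2 giving bits), a uniform distribution reduces to the Hartley entropy:
H_n\left(\frac{1}{n}, \ldots, \frac{1}{n}\right) = -K \sum_{i=1}^{n} \frac{1}{n} \log \frac{1}{n} = K \log n
which is maximal among all distributions on n outcomes and increasing in n, in agreement with the desiderata above.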