Highly optimized tolerance

In complex systems research, highly optimized tolerance is "a general framework for studying complexity", in the words of J. M. Carlson (of the University of California, Santa Barbara) and John Doyle (of the California Institute of Technology). Earlier, in Reference 3, they defined highly optimized tolerance as "a mechanism for complexity based on robustness tradeoffs in systems subject to uncertain environments." Doyle and Carlson have been the main proponents of highly optimized tolerance.

[Image: A forest fire]

In Reference 3, Doyle and Carlson wrote that probability-loss-resource problems are the "simplest examples" of highly optimized tolerance. They point to data compression, the World Wide Web, and forest fires as applications of the probability-loss-resource problem. Generally, the objective is to minimize the expected cost, expressed as a sum over events of each event's probability multiplied by the loss it incurs, subject to a constraint on the resources available to limit those losses. (In the first application the events may be occurrences of source symbols; in the second, file accesses; and in the third, fire ignition and propagation.)
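
This objective can be written compactly. The following is a simplified illustration: the symbols p_i, l_i, r_i, f, and R follow the general description above rather than any single paper's exact conventions. Each event i occurs with probability p_i and incurs a loss l_i that is limited by an allocated resource r_i:

\[
\min_{\{r_i\}} \; J = \sum_i p_i \, l_i ,
\qquad l_i = f(r_i),
\qquad \text{subject to } \sum_i r_i \le R ,
\]

where f is a decreasing function relating resources to losses and R is the total resource budget. For a power-law relation such as f(r) = r^{-\beta}, minimizing J under the constraint (for example, by Lagrange multipliers) concentrates resources on the most probable events, and the resulting loss distributions exhibit the heavy tails characteristic of highly optimized tolerance.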

References

  1. Carlson, J. M. & Doyle, J. (1999) Phys. Rev. E 60, 1412–1427.
  2. Carlson, J. M. & Doyle, J. (2000) Phys. Rev. Lett. 84, 2529–2532.
  3. Doyle, J. & Carlson, J. M. (2000) Phys. Rev. Lett. 84, 5656–5659.
  4. Zhou, T. & Carlson, J. M. (2000) Phys. Rev. E 62, 3197–3204.
  5. Robert, C., Carlson, J. M. & Doyle, J. (2001) Phys. Rev. E 63, 056122.
  6. Zhou, T., Carlson, J. M. & Doyle, J. (2002) Proc. Natl. Acad. Sci. USA 99, 2049–2054.
  7. Greene, K. (2005) Science News 168, 230.