Philosophy of information

From Wikipedia, the free encyclopedia

The philosophy of information (PI) is the area of research that studies conceptual issues arising at the intersection of computer science, information technology, and philosophy.

It includes:[1]

  1. the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation, and sciences;
  2. the elaboration and application of information-theoretic and computational methodologies to philosophical problems.



History

The philosophy of information (PI) has evolved from the philosophy of artificial intelligence, the logic of information, cybernetics, social theory, ethics, and the study of language and information.

Logic of information

The logic of information, also known as the logical theory of information, considers the information content of logical signs and expressions along the lines initially developed by Charles Sanders Peirce.

Cybernetics

One source of the philosophy of information can be found in the technical work of Norbert Wiener, Alan Turing, William Ross Ashby, Claude Shannon, Warren Weaver, and the many other scientists working on computing and information theory in the 1940s and early 1950s.

Important work on information and communication was also carried out by Gregory Bateson and his colleagues.

Study of language and information

Later contributions to the field were made by Fred Dretske, Jon Barwise, Brian Cantwell Smith, and others.

The Center for the Study of Language and Information (CSLI) was founded at Stanford University in 1983 by philosophers, computer scientists, linguists, and psychologists, under the direction of John Perry and Jon Barwise.

P.I.

More recently, this field has become known as the philosophy of information. The expression was coined in the 1990s by Luciano Floridi, who has published prolifically in this area with the intention of elaborating a unified and coherent conceptual framework for the whole subject.

Defining information

The word "information" has been defined in many different ways; the following subsections survey some of the most influential proposals.

Peirce

Peirce's concept of information serves to integrate the aspects of signs and expressions that are covered separately by the concepts of denotation and extension on the one hand, and by the concepts of connotation and comprehension on the other.

Shannon

Claude E. Shannon, for his part, was very cautious: “The word ‘information’ has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field.” (Shannon 1993, p. 180)

Following Shannon, Warren Weaver supported a tripartite analysis of information in terms of (1) technical problems concerning the quantification of information, which are dealt with by Shannon's theory; (2) semantic problems relating to meaning and truth; and (3) what he called “influential” problems concerning the impact and effectiveness of information on human behaviour, which he thought had to play an equally important role. These are only two early examples of the problems raised by any analysis of information.
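
To illustrate what "quantification of information" means at Weaver's first, technical level, it may help to state Shannon's entropy measure explicitly; the formula below is standard background and is not drawn from the works cited in this article. For a discrete source emitting symbols $x_i$ with probabilities $p(x_i)$, the average information per symbol is

$$H(X) = -\sum_{i} p(x_i)\,\log_2 p(x_i)$$

where the result is measured in bits; a fair coin toss, for example, carries one bit of information per toss, while a heavily biased coin carries less.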

A map of the main senses in which one may speak of information is provided by the Stanford Encyclopedia of Philosophy article on the subject, on which the preceding paragraphs are based.

Bateson

Gregory Bateson defined information as "a difference that makes a difference".[2]

Floridi

According to Floridi, four kinds of mutually compatible phenomena are commonly referred to as "information":

  • Information about something (e.g. a train timetable)
  • Information as something (e.g. DNA, or fingerprints)
  • Information for something (e.g. algorithms or instructions)
  • Information in something (e.g. a pattern or a constraint).

The word "information" is commonly used so metaphorically or so abstractly that the meaning is unclear.

Philosophical directions

Computing and philosophy

Recent advances in computing, such as the Semantic Web, ontology engineering, knowledge engineering, and modern artificial intelligence, provide philosophy with fertile new notions, evolving subject matters, methodologies, and models for philosophical inquiry. While computer science brings new opportunities and challenges to traditional philosophical studies and changes the way philosophers understand foundational concepts, further major progress in computer science may only be feasible if philosophy provides sound foundations for areas such as bioinformatics, software engineering, knowledge engineering, and ontologies.

Classical topics in philosophy, namely mind, consciousness, experience, reasoning, knowledge, truth, morality, and creativity, are rapidly becoming common concerns and foci of investigation in computer science, for example in areas such as agent computing, software agents, and intelligent mobile agent technologies.

According to Luciano Floridi,[3] one can think of several ways of applying computational methods to philosophical matters:

  1. Conceptual experiments in silico: as an innovative extension of the ancient tradition of thought experiments, philosophers have begun to apply computational modeling to questions in logic, epistemology, philosophy of science, philosophy of biology, philosophy of mind, and so on.
  2. Pancomputationalism (or the fallacy of a powerful metaphor): on this view, computational and informational concepts are considered so powerful that, given the right level of abstraction, anything in the world can be modeled and represented as a computational system, and any process can be simulated computationally. Pancomputationalists, however, then face the hard task of providing credible answers to the following two questions:
    1. how can one avoid blurring all differences among systems?
    2. what would it mean for the system under investigation not to be an informational system (or a computational system, if computation = information processing)?

Information and society

Philosophical studies of the social and cultural aspects of electronically mediated information have been carried out by numerous philosophers and other thinkers. Representative works include:

  • Albert Borgmann, Holding On to Reality: The Nature of Information at the Turn of the Millennium (University of Chicago Press, 1999)
  • Mark Poster, The Mode of Information (University of Chicago Press, 1990)
  • Luciano Floridi, "Informational Nature of Reality", key talk selected at the E-CAP 2006 conference (Trondheim, 2006)

Notes

  1. ^ Luciano Floridi, "What is the Philosophy of Information?", Metaphilosophy 33 (1/2), 2002.
  2. ^ Extract from "Steps to an Ecology of Mind".
  3. ^ Luciano Floridi, "Open Problems in the Philosophy of Information", Metaphilosophy 35 (4), 2004, 554–582. Revised version of The Herbert A. Simon Lecture on Computing and Philosophy given at Carnegie Mellon University in 2001.
