Sonification

Sonification, a form of auditory display, is the use of non-speech audio to convey information or perceptualize data.[1] Auditory perception has high temporal and pressure resolution, which makes it a promising alternative or complement to visualization techniques.

For example, the Geiger counter uses sonification to indicate how much radiation is present: the rate of audio clicks depends directly on the radiation level in the immediate vicinity of the device.

Though many experiments with data sonification have been presented at forums such as the International Community for Auditory Display (ICAD), sonification faces many challenges on the way to widespread use for presenting and analyzing data. It is difficult to provide adequate context for interpreting sonifications of data.[1] Also, no flexible tool yet exists for sonification research and actual data exploration, so many sonification attempts are coded from scratch.[2]

History

The Geiger counter, invented in 1908, is one of the earliest and most successful applications of sonification. A Geiger counter has a tube of low-pressure gas; each particle detected produces a pulse of current when it ionizes the gas, producing an audio click. The original version was only capable of detecting alpha particles. In 1928, Geiger and Walther Müller (a PhD student of Geiger) improved the counter so that it could detect more types of ionizing radiation.

In 1913, Dr. E. E. Fournier d'Albe of Birmingham University invented the optophone, which used selenium photosensors to detect black print and convert it into an audible output that a blind reader could interpret.[3] The reader would hold a book up to the device and move a scanning apparatus over the area she wanted to read. The optophone played a fixed group of notes: g c' d' e' g' b' c'' e''. Each note corresponded to a position on the optophone's reading area, and a note was silenced when black ink was sensed at that position. The missing notes thus indicated the positions of black ink on the page and could be used to read.

Pollack and Ficks published the first perceptual experiments on the transmission of information via auditory display in 1954.[4] They experimented with combining sound dimensions such as timing, frequency, loudness, duration, and spatialization, and found that subjects could register changes in multiple dimensions at once. These experiments did not go into much more detail than that, since each dimension had only two possible values.

John M. Chambers, Max Mathews, and F.R. Moore at Bell Laboratories did the earliest work on auditory graphing in their "Auditory Data Inspection" technical memorandum in 1974.[5] They augmented a scatterplot with sounds that varied along the dimensions of frequency, spectral content, and amplitude modulation, for use in classification. They did not formally assess the effectiveness of these experiments.[6]

In the 1980s, pulse oximeters came into widespread use. Pulse oximeters can sonify the oxygen concentration of blood by emitting higher pitches for higher concentrations. In practice, however, this particular feature may not be widely used by medical professionals because of the risk of too many audio stimuli in medical environments.[7]

The International Community for Auditory Display (ICAD) was founded in 1992. It holds an annual conference for all research related to auditory display, featuring many papers and presentations on sonification. Through its conference and peer-reviewed proceedings, ICAD has given a home to auditory display researchers from many different disciplines.[8]

Sonification techniques

Many different components of a sound can be altered to change the user's perception of it, and in turn, their perception of the underlying information being portrayed. Often, an increase or decrease in some level in this information is indicated by an increase or decrease in pitch, amplitude, or tempo, but it could also be indicated by varying other, less commonly used components. For example, a stock market price could be portrayed by rising pitch as the stock price rose and falling pitch as it fell. To allow the user to distinguish more than one stock, different timbres or brightnesses might be used for the different stocks, or they might be played from different points in space, for example, through different sides of the user's headphones.
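The pitch mapping described above can be sketched in a few lines of code. The following is a minimal illustration, not a standard sonification API: it assumes a simple linear mapping from data values to frequencies in a chosen range, and renders each value as a short sine-wave note. The function names and the price series are hypothetical.

```python
import math

def map_to_pitch(values, f_min=220.0, f_max=880.0):
    """Linearly map each data value to a frequency in [f_min, f_max] Hz."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

def synthesize(freqs, note_dur=0.2, sample_rate=8000):
    """Render each frequency as a short sine-wave note; return raw samples in [-1, 1]."""
    samples = []
    for f in freqs:
        n = int(note_dur * sample_rate)
        samples.extend(math.sin(2 * math.pi * f * i / sample_rate) for i in range(n))
    return samples

# Hypothetical stock prices: rising price maps to rising pitch.
prices = [10.0, 10.5, 11.2, 10.8, 12.0]
freqs = map_to_pitch(prices)
print([round(f) for f in freqs])  # → [220, 385, 616, 484, 880]
```

To portray a second stock simultaneously, the same mapping could be applied with a different waveform (timbre) or the samples routed to the other stereo channel (spatialization), as described above.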

Many studies have sought the best techniques for presenting various types of information, but no conclusive set of techniques has yet been formulated. As sonification is still considered to be in its infancy, current studies work toward determining the best set of sound components to vary in different situations.

Several different techniques for rendering auditory data representations can be distinguished.

References

  1. ^ a b Kramer, Gregory, ed. (1994). Auditory Display: Sonification, Audification, and Auditory Interfaces. Santa Fe Institute Studies in the Sciences of Complexity. Proceedings Volume XVIII. Reading, MA: Addison-Wesley. ISBN 0201626039.
  2. ^ Flowers, J. H. (2005), Brazil, Eoin, ed., "Thirteen years of reflection on auditory graphing: Promises, pitfalls, and potential new directions", Proceedings of the 11th International Conference on Auditory Display (ICAD2005): 406–409, http://www.icad.org/Proceedings/2005/Flowers2005.pdf 
  3. ^ d'Albe, E. E. Fournier (May 1914), "On a Type-Reading Optophone", Proceedings of the Royal Society of London, doi:10.1098 
  4. ^ Pollack, I. and Ficks, L. (1954), "Information of elementary multidimensional auditory displays", Journal of the Acoustical Society of America 
  5. ^ Chambers, J. M. and Mathews, M. V. and Moore, F. R. (1974), "Auditory Data Inspection", Technical Memorandum 74-1214-20 
  6. ^ Frysinger, S. P. (2005), Brazil, Eoin, ed., "A brief history of auditory data representation to the 1980s", Proceedings of the 11th International Conference on Auditory Display (ICAD2005) (Department of Computer Science and Information Systems, University of Limerick): 410–413 
  7. ^ "Continuous auditory monitoring--how much information do we register?", British Journal of Anaesthesia 83 (5): 747–749, 1999, doi:10.1093/bja/83.5.747, http://bja.oxfordjournals.org/content/83/5/747.full.pdf 
  8. ^ Kramer, G. and Walker, B.N. (2005), "Sound science: Marking ten international conferences on auditory display", ACM Transactions on Applied Perception (TAP) 2 (4): 383–388, http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.88.7945&rep=rep1&type=pdf 
  9. ^ Hermann, Thomas, Hunt, Andy and Pauletto, Sandra (2004), "Interacting with Sonification Systems: Closing the Loop", Eighth International Conference on Information Visualisation (IV'04): 879–884, doi:10.1109/IV.2004.1320244, http://doi.ieeecomputersociety.org/10.1109/IV.2004.1320244
  10. ^ Hermann, Thomas and Hunt, Andy (2004), "The Importance of Interaction in Sonification", Proceedings of ICAD Tenth Meeting of the International Conference on Auditory Display, Sydney, Australia, July 6–9, 2004
  11. ^ Pauletto, Sandra and Hunt, Andy (2004), "A Toolkit for Interactive Sonification", Proceedings of ICAD Tenth Meeting of the International Conference on Auditory Display, Sydney, Australia, July 6–9, 2004
  12. ^ Barrass, Stephen, "Developing the Practice and Theory of Stream-based Sonification", Journal of Media Arts Culture, scan
