Sonification


Sonification is the use of non-speech audio to convey information or perceptualize data.[1] Auditory perception has advantages in temporal, spatial, amplitude, and frequency resolution that open possibilities as an alternative or complement to visualization techniques.

For example, the rate of clicking of a Geiger counter conveys the level of radiation in the immediate vicinity of the device.

Though many experiments with data sonification have been explored in forums such as the International Community for Auditory Display (ICAD), sonification faces many challenges to widespread use for presenting and analyzing data. For example, studies show it is difficult, but essential, to provide adequate context for interpreting sonifications of data.[1][2] Many sonification attempts are coded from scratch due to the lack of flexible tooling for sonification research and data exploration.[3]

History

The Geiger counter, invented in 1908, is one of the earliest and most successful applications of sonification. A Geiger counter has a tube of low-pressure gas; each particle detected produces a pulse of current when it ionizes the gas, producing an audio click. The original version was only capable of detecting alpha particles. In 1928, Geiger and Walther Müller (a PhD student of Geiger) improved the counter so that it could detect more types of ionizing radiation.

In 1913, Dr. Edmund Fournier d'Albe of the University of Birmingham invented the optophone, which used selenium photosensors to detect black print and convert it into an audible output.[4] A blind reader could hold a book up to the device and hold an apparatus to the area she wanted to read. The optophone played a fixed group of notes: g c' d' e' g' b' c e. Each note corresponded to a position in the optophone's reading area, and a note was silenced if black ink was sensed at its position. The missing notes thus indicated the positions of black ink on the page and could be used to read.

Pollack and Ficks published the first perceptual experiments on the transmission of information via auditory display in 1954.[5] They experimented with combining sound dimensions such as timing, frequency, loudness, duration, and spatialization, and found that subjects could register changes in multiple dimensions at once. The experiments went no further than that, since each dimension had only two possible values.

John M. Chambers, Max Mathews, and F.R. Moore at Bell Laboratories did the earliest work on auditory graphing in their "Auditory Data Inspection" technical memorandum in 1974.[6] They augmented a scatterplot with sounds that varied in frequency, spectral content, and amplitude modulation to aid classification. They did not formally assess the effectiveness of these experiments.[7]

In 1976, philosopher of technology, Don Ihde, wrote, "Just as science seems to produce an infinite set of visual images for virtually all of its phenomena--atoms to galaxies are familiar to us from coffee table books to science magazines; so 'musics,' too, could be produced from the same data that produces visualizations."[8] This appears to be one of the earliest references to sonification as a creative practice.

In the 1980s, pulse oximeters came into widespread use. Pulse oximeters can sonify oxygen concentration of blood by emitting higher pitches for higher concentrations. However, in practice this particular feature of pulse oximeters may not be widely utilized by medical professionals because of the risk of too many audio stimuli in medical environments.[9]

In 1992, the International Community for Auditory Display (ICAD) was founded by Gregory Kramer as a forum for research on auditory display which includes data sonification. ICAD has since become a home for researchers from many different disciplines interested in the use of sound to convey information through its conference and peer-reviewed proceedings.[10]

In May 2022, NASA reported the sonification (converting astronomical data associated with pressure waves into sound) of the black hole at the center of the Perseus galaxy cluster.[11][12]

Some existing applications and projects

Sonification techniques

Many different components can be altered to change the user's perception of the sound, and in turn, their perception of the underlying information. An increase or decrease in some level of the information is most often indicated by an increase or decrease in pitch, amplitude, or tempo, but it can also be conveyed by varying other, less commonly used components. For example, a stock market price could be portrayed by a pitch that rises as the price rises and falls as it falls. To let the listener distinguish more than one stock, the stocks might be given different timbres or brightnesses, or they might be played from different points in space, for example through different sides of the listener's headphones.
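The stock example above can be sketched as a simple parameter mapping. The helper names below (`map_to_pitch`, `sonify_stocks`) are illustrative, not a standard sonification API; a real system would feed the resulting frequencies and pan positions to a synthesizer.

```python
# Minimal parameter-mapping sketch: each data value becomes a pitch, so
# rising values are heard as rising tones. Helper names are hypothetical.

def map_to_pitch(values, low_hz=220.0, high_hz=880.0):
    """Linearly map data values onto a frequency range (in Hz)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [low_hz + (v - lo) / span * (high_hz - low_hz) for v in values]

def sonify_stocks(series_by_name):
    """Distinguish streams by stereo position: -1.0 = left, +1.0 = right."""
    pans = dict(zip(series_by_name, [-1.0, 1.0, 0.0]))
    return {name: {"pan": pans[name], "pitches_hz": map_to_pitch(series)}
            for name, series in series_by_name.items()}

rendered = sonify_stocks({"ACME": [10, 12, 11, 15], "GLOBEX": [30, 28, 29, 27]})
```

Here the minimum price maps to 220 Hz and the maximum to 880 Hz, a two-octave range in which pitch differences are easy to hear.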

Many studies have sought the best techniques for presenting various types of information, but no conclusive set of techniques has yet been formulated. Because sonification is still considered to be in its infancy, current studies focus on determining which sound components are best varied in which situations.

Several different techniques for auditory rendering of data can be distinguished, including audification (playing the data samples directly as a sound waveform), parameter-mapping sonification (mapping data dimensions to acoustic parameters such as pitch, loudness, or tempo), and model-based sonification (building a virtual sound-producing model whose behavior is shaped by the data and excited by the user).
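Audification is the most direct of these techniques: the data series itself becomes the waveform. A minimal sketch (the `audify` helper and output file name are illustrative assumptions, not a standard API) normalizes the samples and writes them as PCM audio:

```python
# Audification sketch: play a data series directly as an audio waveform
# by normalizing samples to [-1, 1] and writing 16-bit mono PCM.
import math
import struct
import wave

def audify(samples, path="audified.wav", rate=8000):
    """Write a data series as a playable WAV file; returns duration in s."""
    peak = max(abs(s) for s in samples) or 1.0
    frames = b"".join(
        struct.pack("<h", int(32767 * s / peak)) for s in samples)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)   # mono
        f.setsampwidth(2)   # 16-bit samples
        f.setframerate(rate)
        f.writeframes(frames)
    return len(samples) / rate

# One second of a synthetic "measurement" series audified at 8 kHz:
data = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
duration = audify(data)
```

The playback rate is a key design choice: a slowly sampled measurement series must be replayed thousands of times faster than real time to land in the audible frequency range.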

An alternative approach to traditional sonification is "sonification by replacement", for example Pulsed Melodic Affective Processing (PMAP).[50][51][52] In PMAP, rather than sonifying a data stream, the computational protocol is itself musical data, for example MIDI. The data stream represents a non-musical state: in PMAP, an affective state. Calculations can then be done directly on the musical data, and the results can be listened to with a minimum of translation.
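The idea can be illustrated with a toy sketch in the spirit of PMAP. The mappings below (valence to pitch, arousal to pulse rate, and the averaging/maximum combination rule) are illustrative assumptions, not the published protocol; the point is that the computation operates on the note stream itself.

```python
# "Sonification by replacement" toy model: affective states are encoded
# directly as MIDI-like note streams, and computation happens on the
# musical data. All mappings here are illustrative, not the PMAP spec.

MIDDLE_C = 60  # MIDI note number

def affect_to_melody(valence, arousal, length=8):
    """Encode an affective state as a note stream (assumed mapping:
    valence in [-1, 1] shifts pitch; arousal in [0, 1] sets pulse rate)."""
    pitch = MIDDLE_C + round(valence * 12)  # up to one octave up or down
    rate = 1 + round(arousal * 7)           # 1..8 notes per second
    return {"notes": [pitch] * length, "rate": rate}

def combine(a, b):
    """Combine two states in the musical domain: pitches average
    (valence blends), and the faster pulse wins (arousal dominates)."""
    notes = [(x + y) // 2 for x, y in zip(a["notes"], b["notes"])]
    return {"notes": notes, "rate": max(a["rate"], b["rate"])}

calm_happy = affect_to_melody(valence=0.5, arousal=0.2)
tense = affect_to_melody(valence=-0.5, arousal=0.9)
blended = combine(calm_happy, tense)
```

Because the result is already a melody, "reading out" the computation is just playing it; no separate sonification step is needed.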

See also

References

  1. 1.0 1.1 Kramer, Gregory, ed. (1994). Auditory Display: Sonification, Audification, and Auditory Interfaces. Santa Fe Institute Studies in the Sciences of Complexity. Proceedings Volume XVIII. Reading, MA: Addison-Wesley. ISBN 978-0-201-62603-2. https://archive.org/details/auditorydisplays00greg. 
  2. Smith, Daniel R.; Walker, Bruce N. (2005). "Effects of Auditory Context Cues and Training on Performance of a Point Estimation Sonification Task". Journal of Applied Cognitive Psychology 19 (8): 1065–1087. doi:10.1002/acp.1146. 
  3. Flowers, J. H. (2005), "Thirteen years of reflection on auditory graphing: Promises, pitfalls, and potential new directions", in Brazil, Eoin, Proceedings of the 11th International Conference on Auditory Display, pp. 406–409, http://www.icad.org/Proceedings/2005/Flowers2005.pdf 
  4. Fournier d'Albe, E. E. (May 1914), "On a Type-Reading Optophone", Proceedings of the Royal Society of London 
  5. Pollack, I.; Ficks, L. (1954), "Information of elementary multidimensional auditory displays", Journal of the Acoustical Society of America 26 (1): 136, doi:10.1121/1.1917759, Bibcode: 1954ASAJ...26Q.136P 
  6. Chambers, J. M. and Mathews, M. V. and Moore, F. R. (1974), "Auditory Data Inspection", Technical Memorandum, 74-1214-20 
  7. Frysinger, S. P. (2005), "A brief history of auditory data representation to the 1980s", in Brazil, Eoin, Proceedings of the 11th International Conference on Auditory Display, pp. 410–413, http://www.icad.org/Proceedings/2005/Frysinger2005.pdf 
  8. Ihde, Don (2007-10-04). Listening and Voice: Phenomenologies of Sound, Second Edition. SUNY Press. p. xvi. ISBN 978-0791472569. 
  9. Craven, R M; McIndoe, A K (1999), "Continuous auditory monitoring—how much information do we register?", British Journal of Anaesthesia 83 (5): 747–749, doi:10.1093/bja/83.5.747, PMID 10690137, http://bja.oxfordjournals.org/content/83/5/747.full.pdf 
  10. Kramer, G.; Walker, B.N. (2005), "Sound science: Marking ten international conferences on auditory display", ACM Transactions on Applied Perception 2 (4): 383–388, doi:10.1145/1101530.1101531 
  11. Watzke, Megan; Porter, Molly; Mohon, Lee (4 May 2022). "New NASA Black Hole Sonifications with a Remix". NASA. https://www.nasa.gov/mission_pages/chandra/news/new-nasa-black-hole-sonifications-with-a-remix.html. Retrieved 11 May 2022. 
  12. Overbye, Dennis (7 May 2022). "Hear the Weird Sounds of a Black Hole Singing - As part of an effort to "sonify" the cosmos, researchers have converted the pressure waves from a black hole into an audible … something.". The New York Times. https://www.nytimes.com/2022/05/07/science/space/astronomy-black-hole-sound.html. Retrieved 11 May 2022. 
  13. Montgomery, E.T; Schmitt, R.W (1997), "Acoustic altimeter control of a free vehicle for near-bottom turbulence measurements", Deep Sea Research Part I: Oceanographic Research Papers 44 (6): 1077, doi:10.1016/S0967-0637(97)87243-3, Bibcode: 1997DSRI...44.1077M 
  14. Quincke, G. (1897). "Ein akustisches Thermometer für hohe und niedrige Temperaturen" [An acoustic thermometer for high and low temperatures]. Annalen der Physik 299 (13): 66–71. doi:10.1002/andp.18972991311. ISSN 0003-3804. Bibcode: 1897AnP...299...66Q. https://zenodo.org/record/1423936. 
  15. Ismailogullari, Abdullah; Ziemer, Tim (2019). "Soundscape clock: Soundscape compositions that display the time of day". International Conference on Auditory Display. 25. pp. 91–95. doi:10.21785/icad2019.034. ISBN 978-0-9670904-6-7. 
  16. LIGO Gravitational Wave Chirp, https://www.youtube.com/watch?v=TWqhUANNFXw, retrieved 2021-09-15 
  17. Hunt, A.; Hermann, T.; Pauletto, S. (2004). "Interacting with sonification systems: closing the loop". Proceedings. Eighth International Conference on Information Visualisation, 2004. IV 2004. pp. 879–884. doi:10.1109/IV.2004.1320244. ISBN 0-7695-2177-0. http://www.informatik.uni-trier.de/~ley/db/conf/iv/iv2004.html#HuntHP04. 
  18. Thomas Hermann, and Andy Hunt. The Importance of Interaction in Sonification. Proceedings of ICAD Tenth Meeting of the International Conference on Auditory Display, Sydney, Australia, July 6–9, 2004. Available: online
  19. Sandra Pauletto and Andy Hunt. A Toolkit for Interactive Sonification. Proceedings of ICAD Tenth Meeting of the International Conference on Auditory Display, Sydney, Australia, July 6–9, 2004. Available: online.
  20. Kather, Jakob Nikolas; Hermann, Thomas; Bukschat, Yannick; Kramer, Tilmann; Schad, Lothar R.; Zöllner, Frank Gerrit (2017). "Polyphonic sonification of electrocardiography signals for diagnosis of cardiac pathologies". Scientific Reports 7: Article-number 44549. doi:10.1038/srep44549. PMID 28317848. Bibcode: 2017NatSR...744549K. 
  21. Edworthy, Judy (2013). "Medical audible alarms: a review". J Am Med Inform Assoc 20 (3): 584–589. doi:10.1136/amiajnl-2012-001061. PMID 23100127. 
  22. Woerdeman, Peter A.; Willems, Peter W.A.; Noordsmans, Herke Jan; Berkelbach van der Sprenken, Jan Willem (2009). "Auditory feedback during frameless image-guided surgery in a phantom model and initial clinical experience". J Neurosurg 110 (2): 257–262. doi:10.3171/2008.3.17431. PMID 18928352. 
  23. Ziemer, Tim; Black, David (2017). "Psychoacoustically motivated sonification for surgeons". International Journal of Computer Assisted Radiology and Surgery 12 ((Suppl 1):1): 265–266. doi:10.1007/s11548-017-1588-3. PMID 28527024. 
  24. Ziemer, Tim; Black, David; Schultheis, Holger (2017). "Psychoacoustic sonification design for navigation in surgical interventions". Proceedings of Meetings on Acoustics. 30. pp. 050005. doi:10.1121/2.0000557. 
  25. Ziemer, Tim; Black, David (2017). "Psychoacoustic sonification for tracked medical instrument guidance". The Journal of the Acoustical Society of America 141 (5): 3694. doi:10.1121/1.4988051. Bibcode: 2017ASAJ..141.3694Z. 
  26. Nagel, F; Stöter, F. R.; Degara, N; Balke, S; Worrall, D (2014), "Fast and accurate guidance - response times to navigational sounds", International Conference on Auditory Display. 
  27. Florez, L (1936). "True blind flight". J Aeronaut Sci 3 (5): 168–170. doi:10.2514/8.176. 
  28. 28.0 28.1 28.2 Ziemer, Tim; Schultheis, Holger; Black, David; Kikinis, Ron (2018). "Psychoacoustical Interactive Sonification for Short-Range Navigation". Acta Acustica United with Acustica 104 (6): 1075–1093. doi:10.3813/AAA.919273. 
  29. 29.0 29.1 29.2 Ziemer, Tim; Schultheis, Holger (2018). "Psychoacoustic auditory display for navigation: an auditory assistance system for spatial orientation tasks". Journal on Multimodal User Interfaces 2018 (Special Issue: Interactive Sonification): 205–218. doi:10.1007/s12193-018-0282-2. https://rdcu.be/bbZEp. Retrieved 24 January 2019. 
  30. Mannone, Maria (2018). "Knots, Music and DNA". Journal of Creative Music Systems 2 (2). doi:10.5920/jcms.2018.02. https://www.jcms.org.uk/article/id/523/. 
  31. "SPDF - Sonification". 2005-11-13. http://spdf.gsfc.nasa.gov/research/sonification/sonification.html. 
  32. Hinckfuss, Kelly; Sanderson, Penelope; Loeb, Robert G.; Liley, Helen G.; Liu, David (2016). "Novel Pulse Oximetry Sonifications for Neonatal Oxygen Saturation Monitoring". Human Factors 58 (2): 344–359. doi:10.1177/0018720815617406. PMID 26715687. 
  33. Sanderson, Penelope M.; Watson, Marcus O.; Russell, John (2005). "Advanced Patient Monitoring Displays: Tools for Continuous Informing". Anesthesia & Analgesia 101 (1): 161–168. doi:10.1213/01.ANE.0000154080.67496.AE. PMID 15976225. 
  34. Schwarz, Sebastian; Ziemer, Tim (2019). "A psychoacoustic sound design for pulse oximetry". International Conference on Auditory Display. 25. pp. 214–221. doi:10.21785/icad2019.024. ISBN 978-0-9670904-6-7. 
  35. Schuett, Jonathan H.; Winton, Riley J.; Batterman, Jared M.; Walker, Bruce N. (2014). "Auditory weather reports". Proceedings of the 9th Audio Mostly: A Conference on Interaction with Sound. AM '14. New York, NY, USA: ACM. pp. 17:1–17:7. doi:10.1145/2636879.2636898. ISBN 9781450330329. 
  36. Polli, Andrea (July 6–9, 2004). "ATMOSPHERICS/WEATHER WORKS: A MULTI-CHANNEL STORM SONIFICATION PROJECT". ICAD 04-Tenth Meeting of the International Conference on Auditory Display. http://www.icad.org/websiteV2.0/Conferences/ICAD2004/papers/polli.pdf. 
  37. Yang, Jiajun; Hermann, Thomas (June 20–23, 2017). "PARALLEL COMPUTING OF PARTICLE TRAJECTORY SONIFICATION TO ENABLE REAL-TIME INTERACTIVITY". The 23rd International Conference on Auditory Display. http://icad.org/icad2017/icad2017_paper_22.pdf. 
  38. "Justin Joque". http://justinjoque.com/sonification.php. 
  39. Banf, Michael; Blanz, Volker (2013). "Sonification of images for the visually impaired using a multi-level approach". Proceedings of the 4th Augmented Human International Conference. New York, New York, USA: ACM Press. pp. 162–169. doi:10.1145/2459236.2459264. ISBN 9781450319041. 
  40. Banf, Michael; Mikalay, Ruben; Watzke, Baris; Blanz, Volker (June 2016). "PictureSensation – a mobile application to help the blind explore the visual world through touch and sound". Journal of Rehabilitation and Assistive Technologies Engineering 3: 205566831667458. doi:10.1177/2055668316674582. ISSN 2055-6683. PMID 31186914. 
  41. CURAT. "Games and Training for Minimally Invasive Surgery". University of Bremen. http://curat.informatik.uni-bremen.de/en/. 
  42. Winkler, Helena; Schade, Eve Emely Sophie; Kruesilp, Jatawan; Ahmadi, Fida. "Tiltification – The Spirit Level Using Sound". University of Bremen. https://sonification.uni-bremen.de. 
  43. Silberman, S. (February 6, 2012). “Inside the Mind of a Synaesthete”. PLOS ONE.
  44. Weidenfeld, J. September 28, 2013. "10 Cool Ways To Create Music With Technology". Listverse.
  45. Byrne, M. February 14, 2012. “With Images for Your Earholes, Sonified Wins Augmented Reality with Custom Synesthesia”. Vice / Motherboard
  46. "PriceSquawk". https://pricesquawk.com/market-sonification. 
  47. Barrass S. (2012) Digital Fabrication of Acoustic Sonifications, Journal of the Audio Engineering Society, September 2012. online
  48. Barrass, S. and Best, G. (2008). Stream-based Sonification Diagrams. Proceedings of the 14th International Conference on Auditory Display, IRCAM Paris, 24–27 June 2008. online
  49. Barrass S. (2009) Developing the Practice and Theory of Stream-based Sonification. Scan: Journal of Media Arts Culture, Macquarie University.
  50. Kirke, Alexis; Miranda, Eduardo (2014-05-06). "Pulsed Melodic Affective Processing: Musical structures for increasing transparency in emotional computation". Simulation 90 (5): 606. doi:10.1177/0037549714531060. 
  51. "Towards Harmonic Extensions of Pulsed Melodic Affective Processing – Further Musical Structures for Increasing Transparency in Emotional Computation". 2014-11-11. http://cmr.soc.plymouth.ac.uk/pubs/IJUC_EM_04_Kirke_V3.pdf. 
  52. "A Hybrid Computer Case Study for Unconventional Virtual Computing". 2015-06-01. http://www.oldcitypublishing.com/journals/ijuc-home/ijuc-issue-contents/ijuc-volume-11-number-3-4-2015/ijuc-11-3-4-p-205-226/. 

External links