WELCOME TO THE SCIENTIFIC SONIFICATION PROJECT!


Sound, alone or in combination with visual imaging techniques, offers a powerful means of transmitting information. It can significantly increase the bandwidth of the human/computer interface. This project is about sound: sound synthesis, rendering complex data sets in sounds, and visualizing sounds in a virtual-reality environment.

The Scientific Sonification project is based on the concept that sounds are dynamic events that evolve in a multidimensional space. Each sound is a superposition of component waves (partials), whose evolution is determined by a set of static and dynamic control parameters. By defining a mapping from the data space to the sound space, we create a formalism for the faithful rendition of data in sounds.
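
As an illustration of such a data-to-sound mapping, the sketch below renders a small set of data points as a single sound by additive synthesis: each data point sets the frequency and amplitude of one partial, and the sound is the superposition of those partials. This is a minimal sketch in Python (with NumPy) under assumed conventions, not the DIASS formalism; the function name, the choice of control parameters, and the sample values are hypothetical.

    import numpy as np

    SAMPLE_RATE = 44100  # samples per second

    def render_sound(data_rows, duration=2.0):
        """Render data points as one sound by additive synthesis.

        Hypothetical mapping: each (frequency, amplitude) pair in data_rows
        controls one partial; the sound is the superposition of the partials.
        """
        t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
        signal = np.zeros_like(t)
        for freq, amp in data_rows:
            signal += amp * np.sin(2.0 * np.pi * freq * t)  # one component wave (partial)
        peak = np.max(np.abs(signal))
        return signal / peak if peak > 0 else signal        # normalize to avoid clipping

    # Example: three data points rendered as a three-partial sound.
    sound = render_sound([(220.0, 1.0), (440.0, 0.5), (660.0, 0.25)])

A full sound description carries many more static and dynamic control parameters than the two used here; the sketch only shows the general shape of a data-to-sound mapping.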

Our research focuses on establishing coordinate systems in sound space that optimize the aural perception of given data sets and their salient features. The data sets are taken from large-scale scientific computations; so far, we have rendered data from computational chemistry, materials science, and computer science.

The figure above shows nine sounds created with DIASS and visualized with M4CAVE on a CAVE simulator. The component waves (partials) in each sound are represented by spheres; the diameter of a sphere is proportional to the amplitude of the partial, and its height is a measure of the frequency. The grid in the background indicates the frequency spectrum; it spans eight octaves, corresponding approximately to the range of a piano. The colors of the spheres reflect the amount of reverberation in each partial.
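
This sphere mapping can be made concrete with a small sketch. The Python fragment below implements the stated proportionalities (diameter from amplitude, height from the position of the frequency within the eight-octave grid, color from reverberation); it is not the M4CAVE code, and the 27.5 Hz lower bound of the grid and the 0..1 color scale are assumptions.

    import math

    F_MIN = 27.5  # assumed lower bound of the eight-octave grid (A0, 27.5 Hz)

    def partial_to_sphere(amplitude, frequency, reverb, grid_height=8.0):
        """Map one partial to sphere display parameters (a sketch, not M4CAVE):
        diameter proportional to amplitude, height set by the position of the
        frequency within the eight-octave grid, color taken from reverberation."""
        diameter = amplitude
        octaves_above_fmin = math.log2(frequency / F_MIN)  # 0..8 across the grid
        height = grid_height * octaves_above_fmin / 8.0
        color = min(max(reverb, 0.0), 1.0)                 # clamp to a 0..1 color scale
        return diameter, height, color

    # Example: a mid-amplitude partial at 440 Hz with moderate reverberation.
    print(partial_to_sphere(amplitude=0.5, frequency=440.0, reverb=0.3))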

DIASS and M4CAVE are part of a comprehensive “Environment for Music Composition,” a software system that includes, besides DIASS and M4CAVE, programs for computer-assisted composition and automatic music notation.

The following recent publications are available on the web:

  1. H. G. Kaper and S. Tipei, “Formalizing the Concept of Sound,” in: Proc. Int'l Computer Music Conference '99, Beijing (October 1999), pp. 387-390.
  2. H. G. Kaper and S. Tipei, “Manifold Compositions, Music Visualization, and Scientific Sonification in an Immersive Virtual-Reality Environment,” in: Proc. Int'l Computer Music Conference '98, Ann Arbor, Michigan (October 1998), pp. 399-405.
  3. H. G. Kaper, S. Tipei, and E. Wiebel, “Data Sonification and Sound Visualization,” Computing in Science and Engineering, Vol. 1, No. 4 (July/August 1999), pp. 48-58.
  4. H. G. Kaper, S. Tipei, and E. Wiebel, “High-Performance Computing, Music Composition, and the Sonification of Scientific Data,” Technical Report ANL/MCS-P690-0997.

Contacts:

Hans G. Kaper, MCS Division
Argonne National Laboratory
Argonne, IL 60439
E-mail: [email protected]
(630) 252-7160

Sever Tipei, School of Music
University of Illinois at Urbana/Champaign
Urbana, IL 61801
E-mail: [email protected]
(217) 333-6689

Last update: October 21, 1998 (HGK)