Sound and music computing

Summary

Sound and music computing (SMC) is a research field that studies the whole sound and music communication chain from a multidisciplinary point of view. By combining scientific, technological and artistic methodologies, it aims to understand, model and generate sound and music through computational approaches.

History

The Sound and Music Computing research field can be traced back to the 1950s, when a few experimental composers, together with engineers and scientists, independently and in different parts of the world began exploring the use of the new digital technologies for music applications. Since then the SMC research field has had a fruitful history under a variety of names: "computer music" and "music technology" are probably the terms that have been used the most, with "sound and music computing" being more recent. In 1974, the research community established the International Computer Music Association and the International Computer Music Conference, and in 1977 the Computer Music Journal was founded. The Center for Computer Research in Music and Acoustics (CCRMA) at Stanford University was created in the mid-1970s and the Institute for Research and Coordination in Acoustics/Music (IRCAM) in Paris in the late 1970s.

The term "Sound and Music Computing" was first proposed in the mid-1990s[1] and was included in the ACM Computing Classification System. Under this name, the Sound and Music Computing Conference was started in 2004; in the same year the European Commission funded a roadmapping initiative that resulted in the SMC Roadmap[2] and in the Sound and Music Computing Summer School.

With increasing research specialization within the SMC field, a number of focused conferences have been created. Particularly relevant are the International Conference on Digital Audio Effects, established in 1998, the International Conference on Music Information Retrieval (ISMIR), established in 2000, and the International Conference on New Interfaces for Musical Expression (NIME), established in 2001.

Subfields

The current SMC research field can be grouped into a number of subfields that focus on specific aspects of the sound and music communication chain.

  • Processing of sound and music signals: This subfield focuses on audio signal processing techniques for the analysis, transformation and resynthesis of sound and music signals (a minimal analysis-resynthesis sketch follows this list).
  • Understanding and modeling sound and music: This subfield focuses on understanding and modeling sound and music through computational approaches. It includes computational musicology, music information retrieval, and the more computational approaches to music cognition.
  • Interfaces for sound and music: This subfield focuses on the design and implementation of computer interfaces for sound and music, and is closely related to human-computer interaction.
  • Assisted sound and music creation: This subfield focuses on the development of computer tools for assisting sound design and music composition. It includes traditional fields such as algorithmic composition.
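
A minimal illustration of the analysis-transformation-resynthesis pipeline mentioned in the first subfield is sketched below in Python with NumPy. The window, hop size and spectral tilt are illustrative assumptions rather than part of any standard SMC tool: the signal is analysed with a short-time Fourier transform (STFT), its high frequencies are attenuated, and it is resynthesised by overlap-add.

    import numpy as np

    def stft(x, win, hop):
        # Analysis: slide a window over the signal and take the FFT of each frame.
        return np.array([np.fft.rfft(x[s:s + len(win)] * win)
                         for s in range(0, len(x) - len(win) + 1, hop)])

    def istft(X, win, hop):
        # Resynthesis: inverse-FFT each frame and overlap-add with window normalisation.
        n = hop * (len(X) - 1) + len(win)
        y, norm = np.zeros(n), np.zeros(n)
        for i, frame in enumerate(X):
            s = i * hop
            y[s:s + len(win)] += np.fft.irfft(frame, len(win)) * win
            norm[s:s + len(win)] += win ** 2
        return y / np.maximum(norm, 1e-12)

    sr = 16000
    t = np.arange(sr) / sr
    x = 0.5 * np.sin(2 * np.pi * 440 * t)        # one second of a 440 Hz test tone
    win, hop = np.hanning(1024), 256

    X = stft(x, win, hop)                        # analysis
    X *= np.exp(-np.arange(X.shape[1]) / 80.0)   # transformation: attenuate higher bins
    y = istft(X, win, hop)                       # resynthesis

The same structure underlies phase vocoders, spectral-modeling synthesizers and many audio effects; only the transformation applied to the STFT frames changes.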

Areas of application

SMC research is driven by applications. Examples include:

  • Digital music instruments: This application focuses on musical sound generation and processing devices. It encompasses the simulation of traditional instruments, the transformation of sound in recording studios or at live performances, and musical interfaces for augmented or collaborative instruments.
  • Music production: This application domain focuses on technologies and tools for music composition. Applications range from music modeling and generation to tools for music post-production and audio editing.
  • Music information retrieval: This application domain focuses on retrieval technologies for music (both audio and symbolic data). Applications range from audio identification and broadcast monitoring to higher-level semantic descriptions, together with all associated tools for search and retrieval.
  • Digital music libraries: This application places particular emphasis on preservation, conservation and archiving, and on the integration of musical audio content and metadata descriptions, with a focus on flexible access. Applications range from large distributed libraries to mobile access platforms.
  • Interactive multimedia systems: These are for use in everyday appliances and in artistic and entertainment applications. They aim to facilitate music-related human-machine interaction involving various modalities of action and perception (e.g. auditory, visual, olfactory, tactile, haptic, and all kinds of body movements), which can be captured with audio/visual, kinematic and bioparametric (skin conductance, temperature) devices.
  • Auditory interfaces: These include all applications where non-verbal sound is employed in the communication channel between the user and the computing device. Auditory displays are used in applications and objects that require monitoring of some type of information. Sonification is used as a method for data display in a wide range of application domains where auditory inspection, analysis and summarisation can be more efficient than traditional visual display (a minimal sonification sketch follows this list). Sonic interaction design emphasizes the role of sound in interactive contexts.
  • Augmented action and perception: This refers to tools that increase the normal action and perception capabilities of humans. The system adds virtual information to a user's sensory perception by merging real images, sounds and haptic sensations with virtual ones. This has the effect of augmenting the user's sense of presence and of making possible a symbiosis between the user's view of the world and the computer interface. Possible applications are in the medical domain, manufacturing and repair, entertainment, annotation and visualization, and robot tele-operation.
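
As a concrete example of the sonification mentioned under auditory interfaces, the following sketch (Python, using NumPy and the standard-library wave module; the data values, frequency range and note duration are illustrative assumptions) maps a short data series to pitch and writes the result to a WAV file for auditory inspection.

    import numpy as np
    import wave

    def sonify(values, sr=16000, note_dur=0.25, f_lo=220.0, f_hi=880.0):
        # Map each data value to a frequency between f_lo and f_hi and render a short tone.
        v = np.asarray(values, dtype=float)
        span = v.max() - v.min()
        v = (v - v.min()) / (span if span else 1.0)        # normalise to [0, 1]
        freqs = f_lo + v * (f_hi - f_lo)
        t = np.arange(int(sr * note_dur)) / sr
        env = np.hanning(len(t))                           # fade in/out to avoid clicks
        return np.concatenate([0.5 * env * np.sin(2 * np.pi * f * t) for f in freqs])

    signal = sonify([3.1, 3.4, 2.9, 4.8, 5.2, 4.1, 6.0])   # hypothetical measurements
    pcm = (signal * 32767).astype(np.int16)                # 16-bit PCM samples

    with wave.open("sonification.wav", "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(16000)
        w.writeframes(pcm.tobytes())

Rising data values are heard as rising pitch, so a listener can follow the trend without looking at a plot; richer mappings (to loudness, timbre or spatial position) follow the same pattern.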

External links

Research centers

  • Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT) Montreal, Canada
  • Institut de Recherche et Coordination Acoustique/Musique (IRCAM) Paris, France
  • GRAME - National Center for Music Creation, Lyon, France
  • Sound & Music Computing, Aalborg University Copenhagen, Denmark
  • Audio Analysis Lab, Aalborg University, Denmark
  • Music Technology Group, Universitat Pompeu Fabra, Barcelona, Spain
  • International Audio Laboratories Erlangen, Joint Institution of Friedrich-Alexander-Universität Erlangen-Nürnberg and Fraunhofer IIS, Germany
  • Centre for Digital Music (C4DM), Queen Mary, University of London, London, UK
  • Center for Computer Research in Music and Acoustics (CCRMA) Stanford University, USA
  • The Music Computing Lab, The Open University, Milton Keynes, UK
  • Centro di Sonologia Computazionale (CSC), University of Padova, Padova, Italy
  • Laboratorio di Informatica Musicale (LIM), Università degli Studi di Milano, Milano, Italy
  • Institute for Electronic Music and Acoustics (IEM), University of Music and Performing Arts Graz, Austria
  • Center For New Music and Audio Technologies (CNMAT), UC Berkeley, USA
  • Sound and Music Computing, CSC School of Computer Science and Communication, KTH Royal Institute of Technology, Stockholm, Sweden
  • Music Informatics Research Group, School of Informatics, City University London, London, UK
  • Interdisciplinary Centre for Computer Music Research, Faculty of Arts, University of Plymouth, Plymouth, UK
  • Sound & Music Computing Lab, School of Computing, National University of Singapore, Singapore
  • Mexican Center for Music and Sonic Arts, Morelia, Mexico

Associations

  • International Society for Music Information Retrieval (ISMIR)
  • International Computer Music Association (ICMA)

Journals

  • Computer Music Journal
  • Journal of New Music Research
  • Organised Sound

Conferences

  • Sound and Music Computing Conference (SMC)
  • International Conference on Music Information Retrieval (ISMIR)
  • International Conference on New Interfaces for Musical Expression (NIME)
  • International Conference on Digital Audio Effects (DAFX)
  • International Computer Music Conference (ICMC)

Open software tools

  • List of software tools related to SMC

Undergraduate Programmes

  • Computing, Audio and Music Technology BSc (Hons), University of Plymouth, UK

MSc Programmes

  • MSc in Sound & Music Computing, Queen Mary, University of London, UK
  • MSc in Sound & Music Computing, Universitat Pompeu Fabra, Barcelona, Spain
  • MSc in Sound & Music Computing, Aalborg University, Denmark

References

  1. ^ Camurri, A., De Poli, G., and Rocchesso, D. (1995). A taxonomy for Sound and Music Computing. Computer Music Journal, 19(2):4–5.
  2. ^ The S2S2 Consortium (2007). A Roadmap for Sound and Music Computing. Version 1.0. ISBN 978-90-811896-1-3.