Howard, D.M. and Rimell, S. (2004) Real-time gesture-controlled physical modelling music synthesis with tactile feedback. EURASIP Journal on Applied Signal Processing (Special Issue on Model-Based Sound Synthesis), 7, pp. 1001-1006. ISSN 1110-8657.
Full text not available from this repository.
Electronic sound synthesis continues to offer huge potential for the creation of new musical instruments. The traditional approach is, however, seriously limited in that it incorporates only auditory feedback and typically makes use of a sound synthesis model (e.g., additive, subtractive, wavetable, or sampling) that is inherently limited and very often nonintuitive to the musician. In a direct attempt to address these issues, this paper describes a system that provides tactile as well as acoustic feedback, with real-time synthesis that invokes a more intuitive response from players since it is based upon mass-spring physical modelling. Virtual instruments are set up via a graphical user interface in terms of the physical properties of basic, well-understood sounding objects such as strings, membranes, and solids. These can be interconnected to form complex integrated structures. Acoustic excitation can be applied at any point mass via virtual bowing, plucking, striking, a specified waveform, or any external sound source. Virtual microphones can be placed at any point masses to deliver the acoustic output. These aspects of the instrument are described along with the nature of the resulting acoustic output.
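To illustrate the mass-spring approach the abstract describes, the following is a minimal sketch (not the authors' implementation): a one-dimensional string modelled as point masses linked by springs, excited by a pluck at one mass, with a "virtual microphone" reading the displacement at another. All parameter values and variable names are illustrative assumptions.

```python
# A toy mass-spring string: point masses coupled to their neighbours by
# springs, integrated with a simple explicit scheme at audio rate.
# All constants below are assumed for illustration, not taken from the paper.

N = 50            # number of point masses
mass = 0.001      # kg per point mass (assumed)
k = 1000.0        # spring stiffness in N/m (assumed)
damping = 0.9999  # per-step velocity damping factor (assumed)
dt = 1.0 / 44100  # time step at a 44.1 kHz audio rate

pos = [0.0] * N   # displacement of each point mass
vel = [0.0] * N   # velocity of each point mass

# Pluck excitation: displace one interior mass, as if drawing the string aside.
pos[N // 4] = 0.01

mic_index = 3 * N // 4   # virtual microphone placed at another point mass
output = []              # acoustic output samples read at the microphone

for _ in range(200):
    # Net spring force on each interior mass from its two neighbours
    # (ends are fixed, like a terminated string).
    for i in range(1, N - 1):
        force = k * (pos[i - 1] - 2 * pos[i] + pos[i + 1])
        vel[i] = damping * (vel[i] + (force / mass) * dt)
    # Update positions after all velocities for this step are known.
    for i in range(1, N - 1):
        pos[i] += vel[i] * dt
    output.append(pos[mic_index])
```

In the paper's system the same idea generalises to membranes and solids, with excitation (bowing, striking, external audio) injectable at any point mass and microphones readable at any point mass; this sketch only shows the string case with a pluck.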
Institution: The University of York
Academic Units: The University of York > Electronics (York)
Depositing User: York RAE Import
Date Deposited: 08 May 2009 13:16
Last Modified: 08 May 2009 13:16
Publisher: European Association for Speech Signal and Image Processing