In this chapter we have presented an object-oriented music model that can be interpreted as an instance of the basic Digital Signal Processing Object-Oriented Metamodel, in this case dealing with higher-level symbolic musical data.
Following the object-oriented paradigm once again, we model a music system as a set of interrelated objects. These objects will in general belong to one of the following abstract classes: Instrument, Generator, Note, or Score.
An Instrument is a generating Processing object that receives input controls and generates an output sound. An Instrument is, in fact, a logical grouping of autonomous units called Generators. A Generator is the atomic sound-producing unit in an Instrument and can be controlled independently of the other Generators (although it is often influenced by them). Examples are the six strings of a guitar or each of the keys of a piano.
A Note is the actual sounding object attached to each Generator. A Note can be turned on and off and its properties depend on the internal state of its associated Generator and Instrument.
Finally, the internal state of the whole object-oriented music system changes in response to events that are sent to particular Instruments or Generators. A time-ordered collection of such events is known as a Score.
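The relations between these four abstract classes can be sketched as follows. This is a minimal, hypothetical illustration of the model just described, not the actual MetriX or CLAM API: all class and method names, and the event representation as (time, target, event) tuples, are assumptions made for the example.

```python
class Note:
    """The actual sounding object attached to a Generator; it can be
    turned on and off (hypothetical sketch, not the real API)."""
    def __init__(self):
        self.on = False

    def turn_on(self):
        self.on = True

    def turn_off(self):
        self.on = False


class Generator:
    """Atomic sound-producing unit; each Generator can be controlled
    independently of the others."""
    def __init__(self, name):
        self.name = name
        self.note = Note()  # the Note attached to this Generator

    def handle(self, event):
        if event == "note_on":
            self.note.turn_on()
        elif event == "note_off":
            self.note.turn_off()


class Instrument:
    """A logical grouping of autonomous Generators that receives input
    controls and dispatches them to the right Generator."""
    def __init__(self, generator_names):
        self.generators = {n: Generator(n) for n in generator_names}

    def dispatch(self, target, event):
        self.generators[target].handle(event)


class Score:
    """A time-ordered collection of events sent to an Instrument's
    Generators; events are (time, target_generator, event_name) tuples."""
    def __init__(self, events):
        self.events = sorted(events, key=lambda e: e[0])  # order by time

    def play(self, instrument):
        for _time, target, event in self.events:
            instrument.dispatch(target, event)


# Usage: a guitar-like Instrument whose six strings are Generators.
guitar = Instrument([f"string{i}" for i in range(1, 7)])
score = Score([
    (0.0, "string1", "note_on"),
    (1.0, "string1", "note_off"),
    (0.5, "string2", "note_on"),
])
score.play(guitar)
```

After playing the score, string1's Note has been turned on and off again, while string2's Note is still sounding, illustrating how the system's internal state evolves in response to the time-ordered events.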
The abstract model just described is implemented in the MetriX language and in its XML-based version, MetriX-ML. MetriX-ML is a Music-N language and therefore offers a way of defining both Instruments and Scores. It is implemented in CLAM and, apart from the concepts presented above, includes support for defining timbre spaces, break-point functions, and relations between the control parameters of an Instrument.
The model and its implementation are therefore fully usable. Nevertheless, the model is not complete, as it has not yet been adapted to synthesis techniques other than Spectral Modelling Synthesis. Its definition and implementation in the context of the CLAM framework, however, make it extensible and adaptable to future needs.