The MetriX framework was ported to CLAM. On one hand, this meant fitting the MetriX object-oriented music model to the DSPOOM metamodel. As will be seen in this section, and as has already been suggested earlier in this chapter, this adaptation was completely natural. On the other hand, adapting MetriX to the CLAM framework implied gaining XML representation ``for free''. Therefore, the substitution of XML for the textual format was also natural. The overall process gave rise to MetriXML.
The first step when adapting a previously existing model to the DSPOOM metamodel is to identify the metaclass that each of the model elements belongs to. In our case, the basic model classes that must be mapped to a DSPOOM metaclass are: Instrument, Generator, Note, Event, and Score.
The relation of the first two classes to the DSPOOM metamodel is as already illustrated in figure 6.4: an Instrument is a ProcessingComposite made up of Processing objects that are instances of the Generator subclass. The question is then how the Instrument definition expressed in MIDL (see section 6.4.2) relates to the metamodel. The answer is that an Instrument definition is a DSPOOM Configuration that will be used to configure the Instrument Processing object before its execution. Note that this configuration will also be in charge of ``subclassifying'' the generic Instrument into a particular one, as shown in figure 6.1.
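This mapping can be sketched in C++. The sketch below is purely illustrative: the class and field names (InstrumentConfig, numGenerators, Configure) are assumptions for the example and do not follow the actual CLAM API. It only shows the idea that an Instrument is a composite of Generators whose concrete shape is fixed by a Configuration before execution.

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical sketch; names do not follow the real CLAM/DSPOOM API.
struct InstrumentConfig {          // plays the role of a DSPOOM Configuration
    std::string name;              // ``subclassifies'' the generic Instrument
    int numGenerators = 0;         // one Generator per voice, for instance
};

struct Generator {                 // a Processing object inside the composite
    bool running = false;
};

class Instrument {                 // a ProcessingComposite of Generators
public:
    bool Configure(const InstrumentConfig& cfg) {
        mConfig = cfg;
        // Configuration happens before execution and fixes the composite.
        mGenerators.assign(static_cast<std::size_t>(cfg.numGenerators),
                           Generator{});
        return true;
    }
    std::size_t GeneratorCount() const { return mGenerators.size(); }
    const std::string& Name() const { return mConfig.name; }
private:
    InstrumentConfig mConfig;
    std::vector<Generator> mGenerators;
};
```

Configuring the same generic Instrument with a different InstrumentConfig would yield a different ``subclass'' of instrument without any new C++ class being written.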
The other subelements present in an Instrument, such as the Timbre Space or the Break Point Functions (see figure 6.10), have no direct interpretation in the DSPOOM metamodel. At execution time they play a secondary role as auxiliary mechanisms of the Instrument, and at configuration time they are already represented by fields in the overall configuration.
Similarly, the Note class is seen as a conceptual shortcut with no direct interpretation in the DSPOOM metamodel. As a matter of fact, a Note corresponds to the internal state of each of the Generators. As such, we should be able to query a Generator for its Note-state, but no further direct access or representation is needed.
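The ``Note as internal state'' idea can be sketched as follows. Again, the names (Note, NoteOn, CurrentNote) are hypothetical and chosen only for the example: the point is that a Note is never a first-class element of the network, merely state that a Generator exposes for querying.

```cpp
#include <optional>

// Hypothetical sketch: a Note is not a Processing element of its own,
// only the queryable internal state of a Generator.
struct Note {
    double pitch = 0.0;       // illustrative fields
    double amplitude = 0.0;
};

class Generator {
public:
    void NoteOn(double pitch, double amplitude) {
        mNote = Note{pitch, amplitude};
    }
    void NoteOff() { mNote.reset(); }
    // The only access the model requires: query the current Note-state.
    const std::optional<Note>& CurrentNote() const { return mNote; }
private:
    std::optional<Note> mNote;    // empty while the Generator is silent
};
```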
It is clear that a music Event should somehow relate to a DSPOOM Control. But the representation of an Event in a Score, which includes for instance a time tag, is too complex to be directly compatible with the simple asynchronous control mechanism, based on simple data types, that is included in the DSPOOM metamodel. The solution is the one usually recommended when this situation arises while modeling a particular system in DSPOOM (see section 4.1.1). First, a related Processing Data class must be defined; in our case, the Event class will therefore be a Processing Data. Then we need a special sink Processing object class that receives this incoming Processing Data and converts it into DSPOOM control events; in our case we will call this class the Scheduler.
The Scheduler is the class responsible for receiving input Events as Processing Data, enqueueing them if necessary, and firing DSPOOM control events at the appropriate moment according to their time tags and the current system time.
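The Scheduler's behavior can be sketched as a priority queue ordered by time tag. This is a minimal illustration under assumed names (Event, Do, Tick, sendControl); the real class would emit through CLAM control connections rather than a callback, and the Event would carry richer musical data than a single integer.

```cpp
#include <functional>
#include <queue>
#include <vector>

// Hypothetical sketch: Events arrive as Processing Data, are queued by
// time tag, and are turned into simple control values once the system
// clock reaches them.
struct Event {                      // a Processing Data in the metamodel
    double timeTag;                 // when the event must fire
    int controlValue;               // the simple datum sent to a Control
};

struct LaterFirst {                 // makes the priority_queue a min-heap
    bool operator()(const Event& a, const Event& b) const {
        return a.timeTag > b.timeTag;
    }
};

class Scheduler {                   // a sink Processing object
public:
    void Do(const Event& in) { mQueue.push(in); }   // consume incoming data
    // Fire every due event as a DSPOOM-style control call.
    void Tick(double now, const std::function<void(int)>& sendControl) {
        while (!mQueue.empty() && mQueue.top().timeTag <= now) {
            sendControl(mQueue.top().controlValue);
            mQueue.pop();
        }
    }
private:
    std::priority_queue<Event, std::vector<Event>, LaterFirst> mQueue;
};
```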
Finally, the Score class contains Events, which have already been defined as Processing Data objects. It follows naturally that the Score is also a Processing Data class. Figure 6.11 illustrates the class diagram of the MetriXML model in terms of the DSPOOM metamodel.
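A minimal sketch of this aggregation, with assumed names (Score, Add): since Events are Processing Data, a Score that contains them is naturally Processing Data too, and a useful invariant is that it keeps its Events ordered by time tag.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical sketch of the Score as an aggregate Processing Data.
struct Event {
    double timeTag;
    int controlValue;
};

struct Score {
    std::vector<Event> events;

    void Add(const Event& e) {      // keep the score time-ordered
        auto pos = std::upper_bound(
            events.begin(), events.end(), e,
            [](const Event& a, const Event& b) {
                return a.timeTag < b.timeTag;
            });
        events.insert(pos, e);
    }
};
```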
As explained in section 3.2.2, any Configuration implemented in CLAM has automatic XML passivation services.
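To illustrate what passivation amounts to, the hand-written sketch below shows a Configuration storing itself as an XML element. In actual CLAM this serialization is derived automatically rather than written by hand, and the method and element names here (StoreOnXml, InstrumentConfig) are assumptions for the example.

```cpp
#include <sstream>
#include <string>

// Hypothetical illustration of XML passivation of a Configuration.
struct InstrumentConfig {
    std::string name;
    int numGenerators = 0;

    std::string StoreOnXml() const {
        std::ostringstream out;
        out << "<InstrumentConfig>"
            << "<Name>" << name << "</Name>"
            << "<NumGenerators>" << numGenerators << "</NumGenerators>"
            << "</InstrumentConfig>";
        return out.str();
    }
};
```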
In the following subsections we illustrate the process of converting the MetriX textual formats into XML and finding the object-oriented model related to them. To do so, for both the MetriX Instrument Definition Language and the MetriX Score Definition Language, we will follow these steps: