Aura [Dannenberg and Brandt, 1996b, Dannenberg, 2004, Dannenberg and Brandt, 1996a] is a framework for software processing of audio and music signals, developed mainly by Roger B. Dannenberg as a generalization of the CMU MIDI Toolkit. Although Aura has been under development for some years, the author considers that it is not yet mature and stable enough to offer publicly, mainly because it lacks a packaging infrastructure and appropriate documentation. Nevertheless, the author plans to release it as Free Software.
The first versions of Aura ran on MS Windows, but the latest ones work on GNU/Linux using PortAudio, PortMidi, and wxWidgets for the graphical interface. Although the use of system-dependent features is sometimes necessary for real-time performance, Aura aims to be as portable as possible, encapsulating all system-dependent code for graphics, MIDI, and audio. At the same time, Aura tries to reuse as much code as possible from the author's previous efforts and intends to be reusable itself.
Aura offers a way to create, connect, and control signal processing modules. It was designed with audio processing in mind and supports multiple threads and low-latency real-time audio computation.
In its first versions the framework was divided into an architecture for audio signal processing, called Aura, and a framework for building event-driven real-time software called W (see [Dannenberg and Rubine, 1995] for a description of the framework in its initial versions). W was designed as a framework for general real-time computing. It was a successor of V, itself a successor of Garnet, a constraint system written in Lisp. In a musical context, W was used for handling user interaction and MIDI data. The combination of W and Aura allowed operations such as controlling synthesis software from a MIDI keyboard. In the latest versions (see [Dannenberg, 2004]), however, the name Aura refers to the whole framework, including the features and functionality inherited from W.
The Aura model involves objects that send messages to other objects. Objects have typed attributes (integer, double-precision float, etc.). Some objects are created by users, while others are part of the standard Aura environment (debugging tools, I/O ports for MIDI or audio, and interface components).
The basic concepts in Aura are the unit generator (or ugen) and the instrument. A unit generator encapsulates a DSP algorithm in an object that has inputs, outputs, and some form of internal state. The concept is borrowed from the Music N paradigm (see 2.6.1) and is shared by most other environments reviewed in this chapter. An instrument is a static composition of unit generators, designed before compilation.
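The ugen/instrument distinction can be illustrated with a minimal C++ sketch. The class names and interfaces here are invented for illustration only and do not reproduce Aura's actual API:

```cpp
// Hypothetical sketch of the unit-generator/instrument idea (not Aura's code).
#include <algorithm>
#include <cstddef>
#include <vector>

const std::size_t kBlockSize = 32;   // Aura computes audio in blocks of 32 samples

// A unit generator: inputs, outputs, and internal state behind one interface.
struct Ugen {
    virtual ~Ugen() {}
    // Compute one block of output samples from one block of input samples.
    virtual void process(const float* in, float* out) = 0;
};

// Example ugen: a one-pole low-pass filter whose internal state is the
// previous output sample, carried across blocks.
struct OnePole : Ugen {
    float a;       // smoothing coefficient in (0, 1]
    float last;    // internal state
    explicit OnePole(float coeff) : a(coeff), last(0.0f) {}
    void process(const float* in, float* out) override {
        for (std::size_t i = 0; i < kBlockSize; ++i) {
            last = last + a * (in[i] - last);
            out[i] = last;
        }
    }
};

// An "instrument" in this sketch is a fixed chain of ugens, composed
// statically rather than rewired at run time.
struct Instrument {
    std::vector<Ugen*> chain;
    void process(const float* in, float* out) {
        std::vector<float> tmp(in, in + kBlockSize);
        for (Ugen* u : chain) {
            u->process(tmp.data(), out);
            std::copy(out, out + kBlockSize, tmp.begin());
        }
    }
};
```

The key point the sketch captures is that the instrument's topology is fixed at design time, while each ugen keeps its own state between processing calls.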
According to the author, all of the previously existing solutions suffer from focusing on a single paradigm: graphical editing or text-based programming and control. Graphical systems are easy to use but difficult to reconfigure at run time. Textual systems offer more flexibility but lose some intuitive feel and debugging support. Aura offers both graphical editing and textual programming in an attempt to combine the best of both worlds.
More than textual versus graphical, however, the author is interested in static versus dynamic systems. Static connections can be more efficient but are far less flexible. Aura 2 supports everything from fully static to fully dynamic configurations. Static graphs are created with the visual editor, forming instruments that are members of Aura's Instr class. These instruments can then be dynamically allocated and connected at run time.
Although the first versions of Aura intended to make the computing graph as dynamic as possible, experience indicated that most designs were mainly static. Static instruments bring several advantages: they are more efficient because they avoid the overhead of dynamic patching; they are easier to debug; inlining and other optimizations can improve performance; and users can reason better about their code when it is static.
In dynamic graphs, Aura recalculates the execution order whenever the graph is updated. It also uses reference counting to delete objects that are no longer referenced. Aura additionally provides global names for instruments and a remote procedure call system, so instruments can be controlled from processes running asynchronously.
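Reference-counted lifetime management of this kind can be sketched with C++ `shared_ptr`, used here as a stand-in for Aura's own mechanism (which is not shown in the source): a node is destroyed as soon as the last reference to it disappears.

```cpp
// Illustrative sketch (not Aura's implementation): reference-counted graph
// nodes, so an object is deleted once nothing in the graph references it.
#include <memory>
#include <vector>

struct Node;
using NodeRef = std::shared_ptr<Node>;

struct Node {
    std::vector<NodeRef> inputs;   // owning references to upstream nodes
};

// Disconnecting the inputs drops each upstream node's reference count; when a
// count reaches zero, that node (and, recursively, its own inputs) is freed.
inline void disconnect_all(const NodeRef& n) { n->inputs.clear(); }
```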
The Aura visual editor is used to assemble unit generators into instruments, and it is coupled with text-based programming. When the user creates a new instrument, the editor can automatically update the user's makefile and write scripting-language functions to create an instance of the instrument, integrating the new instrument automatically into the user's program.
After the user creates a valid instrument, the editor can generate its C++ implementation. A set method is generated for each signal input to an instrument. The user can then make a remote procedure call to this method from C++ or Serpent (Aura's real-time scripting language), and can do so dynamically. A topological sort is performed on the instrument graph so that signal flows from input to output in a single pass. Buffers are allocated for intermediate results and reused whenever possible to minimize storage. Although it is commonly assumed that an optimized buffer allocation policy is necessary for efficient DSP applications, the author's exhaustive testing showed that it matters only slightly for large collections of unit generators.
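The single-pass ordering can be illustrated with a standard topological sort (Kahn's algorithm). This is a generic sketch, not Aura's actual code; how Aura represents its graph internally is not described in the source.

```cpp
// Kahn's algorithm: order the ugens of a graph so that every node's inputs
// are computed before the node itself, allowing one evaluation pass.
#include <cstddef>
#include <queue>
#include <vector>

// deps[i] lists the nodes that node i reads its input signals from.
std::vector<int> topo_order(const std::vector<std::vector<int>>& deps) {
    std::size_t n = deps.size();
    std::vector<int> indegree(n, 0);
    std::vector<std::vector<int>> users(n);   // reverse edges
    for (std::size_t i = 0; i < n; ++i)
        for (int d : deps[i]) {
            users[d].push_back(static_cast<int>(i));
            ++indegree[i];
        }
    std::queue<int> ready;                    // nodes with all inputs available
    for (std::size_t i = 0; i < n; ++i)
        if (indegree[i] == 0) ready.push(static_cast<int>(i));
    std::vector<int> order;
    while (!ready.empty()) {
        int v = ready.front(); ready.pop();
        order.push_back(v);
        for (int u : users[v])
            if (--indegree[u] == 0) ready.push(u);
    }
    return order;   // order.size() < n would indicate a cycle in the graph
}
```

Intermediate buffers can then be handed out from a free list as the ordered nodes execute: once every user of a buffer has consumed it, the buffer returns to the list for reuse, which is the storage-minimization strategy the text describes.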
Aura supports different kinds of signals, and the editor assists the user by selecting compatible signal types and unit generators.
Furthermore, the Aura graphical editor can automatically generate a visual interface for any new instrument in order to test and debug it.
Messages are time-stamped, with the time stamp indicating when the message should be received. An object can send a message to a particular target or simply to all objects that have been connected to its outputs. These connections are created at run time.
Aura supports a model of computation based on fixed-priority scheduling and the notion that all computation within an object should run at the same priority. Each object is assigned a priority level called a zone. Computation of different objects within a zone is non-preemptive and runs on a single thread. For example, an application can be divided into three zones: one for GUI objects, one for MIDI objects, and one for audio objects. It is up to the application designer to ensure that the total computation in a zone (and in all higher-priority zones) does not exceed the shortest latency required in that zone, but Aura can assist by measuring computation times and detecting exceptions. Messages are delivered synchronously within a zone, while messages between zones are sent asynchronously: all interzone messages are enqueued by the sender and later delivered to the receiver.
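The sender-enqueues/receiver-drains pattern for interzone messages can be sketched as follows; the `Zone` type and its members are invented for illustration and do not correspond to Aura's actual interfaces:

```cpp
// Sketch of asynchronous interzone delivery (names invented for illustration).
// Within a zone, execution is single-threaded and non-preemptive; messages
// arriving from other zones sit in an inbox until the zone next runs.
#include <functional>
#include <queue>
#include <utility>

struct Zone {
    int priority;                                 // fixed priority level
    std::queue<std::function<void()>> inbox;      // enqueued by senders

    // Called by a sender in another zone: enqueue only, never execute here.
    void post(std::function<void()> msg) { inbox.push(std::move(msg)); }

    // Called when this zone is scheduled: drain and deliver pending messages.
    void run_pending() {
        while (!inbox.empty()) {
            inbox.front()();
            inbox.pop();
        }
    }
};
```

A real implementation would need a thread-safe queue, since sender and receiver zones run on different threads; that detail is omitted here for brevity.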
A message stamped for the future is normally stored until its time comes, but a precomputation zone may run ahead of real time, reading incoming messages before it otherwise would. It can then compute and send messages to a real-time zone, where they are automatically buffered. (A precomputation zone can be used, for example, to read audio or video files.)
For efficiency, data is usually calculated in blocks of 32 samples. According to the author, Aura gains roughly a factor-of-two performance increase over previous systems. In particular, Csound (see 2.6.1) does not use interpolation to smooth control signals, so users must compensate by computing with short, inefficient sample blocks. And the ISPW (see 2.5) does not offer low sample rates for control signals, so it computes twice as many samples as Aura to achieve the same result. However, as the author acknowledges, these other systems are more mature and offer more complete libraries.
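The interpolation argument can be illustrated with a sketch: a block-rate control value (here, a gain) is linearly interpolated across one 32-sample block, so control changes are smoothed without computing the control signal at the full audio rate. This is a generic example of the technique, not Aura's code.

```cpp
// Smooth a once-per-block control update by interpolating it across the
// block's samples, avoiding "zipper noise" from stepwise gain changes.
#include <cstddef>

const std::size_t kBlock = 32;   // samples per processing block

void apply_gain_interp(const float* in, float* out,
                       float prev_gain, float next_gain) {
    float step = (next_gain - prev_gain) / static_cast<float>(kBlock);
    float g = prev_gain;
    for (std::size_t i = 0; i < kBlock; ++i) {
        g += step;            // ramp from prev_gain toward next_gain
        out[i] = in[i] * g;
    }
}
```

Without such interpolation, a system must either accept audible stepping or shrink its block size so control updates arrive more often, which is exactly the inefficiency the text attributes to Csound's approach.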