Controlling the speed of evolution. This can be applied either globally, to all
parts of the spectrum, or individually, in which case the different portions of
the spectrum phase against each other. An extreme and interesting case is to
freeze on a single FFT frame.
Adjusting the ratios within spectral subsets. This feature, which might be
described as ‘spectral companding’, creates a warping effect on the spectrum
(and on any formants).
Locking all oscillators to the frequency evolution of a selected bin, according
to a data set of transpositions, so as to ‘harmonize’ the selected oscillator.
Selecting a non-sinusoidal waveform for resynthesis – a distortion effect that
adds many harmonics to the spectrum.
Triggering autonomous algorithmic traversal schemes. This allows the
performance to proceed in unpredictable ways, with behaviors that would be
difficult or impossible for a human performer to produce.
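As a sketch of such an autonomous traversal (the specific schemes are not detailed here, so the algorithm below is purely an illustrative assumption), a bounded random walk over the analysis-frame indices produces jump sequences that no hand-operated fader could:

```python
import random

def random_walk_traversal(num_frames, steps, max_jump=8, seed=0):
    """Hypothetical traversal scheme: a bounded random walk over the
    indices of an FFT analysis set.  Each step jumps up to max_jump
    frames in either direction, clamped to the valid index range."""
    rng = random.Random(seed)
    frame = num_frames // 2          # start in the middle of the set
    path = [frame]
    for _ in range(steps):
        frame += rng.randint(-max_jump, max_jump)
        frame = max(0, min(num_frames - 1, frame))   # clamp to [0, num_frames)
        path.append(frame)
    return path
```

Because the walk is seeded, a given traversal can be reproduced in rehearsal yet still sounds unpredictable in performance.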
Further, it should be noted that with the 7-bit resolution of MIDI ‘continuous’
controllers, it is impossible to navigate smoothly through analysis sets containing
more than 128 frames. (Ours contain several thousand frames.)
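The resolution problem is easy to quantify (the 4000-frame set below is illustrative): a 7-bit controller offers only 128 positions, so a linear mapping onto a several-thousand-frame analysis set forces jumps of dozens of frames between adjacent controller values:

```python
def cc_to_frame(cc_value, num_frames):
    """Map a 7-bit MIDI continuous-controller value (0..127) linearly
    onto the index range of an analysis set."""
    return cc_value * (num_frames - 1) // 127

# With a 4000-frame set, consecutive CC values land about 31 frames apart,
# so every controller step skips dozens of intermediate frames.
jump = cc_to_frame(1, 4000) - cc_to_frame(0, 4000)
```

Here `jump` comes out to 31 frames, which is why smooth scrubbing through such a set is impossible over plain 7-bit MIDI.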
Network Distribution with UDP Packets
The computer running the midiroute program encodes MIDI control data into tagged
numbers, which are then broadcast to the local network. Each datum is sent in a
UDP (User Datagram Protocol) packet to its broadcast destination. Under normal
conditions each UDP packet is encapsulated in a single Ethernet packet, so the
UDP packets should be received nearly simultaneously by every computer on the
local network. This mechanism may be extended to the worldwide network.
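A minimal sketch of this output stage follows; the wire format (two 16-bit fields, tag then value, in network byte order) and the port number are assumptions for illustration, as the actual format used by midiroute is not specified here:

```python
import socket
import struct

def broadcast_control(tag, value, addr="255.255.255.255", port=9999):
    """Pack one tagged control datum and send it as a single UDP
    datagram to the broadcast address.  Format is hypothetical:
    two unsigned 16-bit fields in network byte order."""
    payload = struct.pack("!HH", tag, value)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(payload, (addr, port))
    sock.close()
    return payload
```

Since each datum fits comfortably in one datagram, every receiver gets a complete, self-describing control message per packet, with no stream reassembly required.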
To communicate with groups of computers at different network locations, we add a
relay computer for each group. Each relay receives IP packets from the midiroute
computer and rebroadcasts them as UDP packets on its local network.
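The relay's job can be sketched in a few lines (again an implementation assumption; only the behavior is described above): receive a datagram sent point-to-point from the midiroute machine, then retransmit it unchanged to the local broadcast address:

```python
import socket

def relay_once(listen_sock, out_sock, bcast_addr, port):
    """One step of a hypothetical relay: receive a datagram addressed
    to this machine and rebroadcast its payload, unmodified, on the
    local subnet.  For a real broadcast address, out_sock must have
    SO_BROADCAST enabled."""
    data, _sender = listen_sock.recvfrom(2048)
    out_sock.sendto(data, (bcast_addr, port))
    return data
```

Because the payload is forwarded byte-for-byte, the receiving computers in the remote group need no special handling; the relay is transparent to them.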
Latency, Synchronization Error and Bandwidth
Latency is the delay between a performance gesture made by the performer and the
resulting change of sound in the space. Latency must be kept as low as possible in a
musical performance system. In our system, the average latency is around 100 msec. The
main source of latency is an eight-kilobyte sound output buffer in the Windows
environment; other sources, such as network delay or resynthesis delay, contribute
only marginally. Sound output latency should improve with newer generations of
sound hardware.
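The buffer's contribution can be estimated directly. Assuming 16-bit stereo audio at 44.1 kHz (a format assumption, not stated above), an eight-kilobyte buffer holds roughly 46 msec of audio, so with double buffering the output stage alone plausibly accounts for most of the ~100 msec average:

```python
def buffer_latency_ms(buffer_bytes, sample_rate=44100, channels=2,
                      bytes_per_sample=2):
    """Time needed to drain a sound output buffer, i.e. the latency it
    adds.  Defaults assume 16-bit stereo at 44.1 kHz (our assumption)."""
    bytes_per_second = sample_rate * channels * bytes_per_sample
    return 1000.0 * buffer_bytes / bytes_per_second

single = buffer_latency_ms(8192)   # one 8 KB buffer, roughly 46 msec
```

Under these assumptions, shrinking the buffer (or moving to lower-latency audio drivers) is the most direct way to reduce overall system latency.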
Synchronization error is the difference in latency within a group of computers
receiving the same control data. If the differences are small, the errors are
perceived as spatial effects (randomized delays between channels), as heard in
our system, but if