Various musicians and artists have used mathematical models for artistic ends. Typically a set of mappings is defined which transforms the numerical outputs of equations or computer simulations into parameters in visual or sonic space. This is usually motivated by an aesthetic appreciation of the dynamics of the model and an implicit assumption that these formal structures can be appreciated in perceptual space. Visualisation is integral to Alife research, and many models can be naturally represented in two- or three-dimensional space over time, but it is less obvious that the same dynamics can be successfully conveyed in sound. The aim of this project was to test whether this assumption holds.

Cellular automata have been used extensively in algorithmic composition. One of the simplest CA models, which is used here, is a 1D model where each cell takes on a binary value. The system is usually visualised by plotting the state of each successive iteration as horizontal lines, one below the other. Rule sets can be categorised mathematically (according to a measure of entropy variance in the look-up table (Wuensche)) into one of three classes: ordered, complex or chaotic.
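The model can be sketched in a few lines. This is an illustrative minimal 1D binary CA, not the study's actual implementation; rule 110 is shown because it is a well-known "complex" rule, and the width, step count and wrapping edges are assumptions.

```python
def step(cells, rule):
    """Apply an elementary CA rule to one generation (wrapping edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        # Read the (left, self, right) neighbourhood as a 3-bit index
        # into the rule's look-up table.
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> idx) & 1)
    return out

def run(rule, width=31, steps=8):
    """Iterate from a single seed cell, keeping every generation."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

# The usual visualisation: successive generations as horizontal lines.
for row in run(110):
    print("".join("#" if c else "." for c in row))
```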

These classes can be readily distinguished from a graphical representation of the global CA states.

This study investigated whether people could classify the qualitatively different states produced by different cellular automata rule sets when their outputs were represented sonically.

[Audio examples: ordered · complex · chaotic]

I developed mappings which defined rhythmic and harmonic variation according to individual cell states and entropy variance for 1 dimensional CA rules.

Rhythmic mapping


The rhythmic mapping transforms spatial patterns into temporal patterns by mapping cell state to note status. The 1D array of cell states is read left to right, and four lines are voiced simultaneously at different pitches. This spatio-temporal mapping preserves Gestalt properties thought to be key to pattern perception, such as grouping by proximity.
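One way the mapping above might be sketched is shown below. The four MIDI pitches, and the choice to split each row into four contiguous segments (one per voice), are illustrative assumptions, not the study's actual parameters.

```python
PITCHES = [60, 64, 67, 72]  # hypothetical MIDI pitches for the four lines

def row_to_events(row, voices=4):
    """Map one CA generation to (time, pitch) note-on events.

    The row is divided into `voices` contiguous segments; segment v is
    voiced at PITCHES[v], and each live cell becomes a note whose time
    position is its index within the segment, read left to right.
    """
    seg = len(row) // voices
    events = []
    for v in range(voices):
        for t, cell in enumerate(row[v * seg:(v + 1) * seg]):
            if cell:
                events.append((t, PITCHES[v]))
    return sorted(events)

# A sparse, regular row yields a sparse, regular rhythm in all voices.
print(row_to_events([1, 0, 0, 0] * 4, voices=4))
```

Because spatial adjacency becomes temporal adjacency, cells that cluster in the row produce notes that cluster in time, which is the proximity grouping the text refers to.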

Harmonic mapping


The harmonic mapping determines the pitch of each note according to the frequency distribution of the rule look-up table, which is updated each iteration. Because the statistical distributions vary qualitatively with rule type, this mapping produces chords and chord sequences that differ characteristically between classes.
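A hedged sketch of this idea: count how often each of the eight neighbourhood patterns in the look-up table fires during one iteration, then let that distribution select chord tones. The scale and the threshold rule are illustrative assumptions; the study's actual pitch mapping is not specified here.

```python
from collections import Counter

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # hypothetical C-major degrees

def lookup_histogram(cells):
    """Frequency of each 3-cell neighbourhood index over one generation."""
    n = len(cells)
    counts = Counter()
    for i in range(n):
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        counts[idx] += 1
    return counts

def chord_from_histogram(counts, threshold=2):
    """Voice a chord tone for every neighbourhood used at least `threshold`
    times. Skewed distributions (ordered rules) give sparse chords; flat
    distributions (chaotic rules) give dense ones."""
    return sorted(SCALE[idx] for idx, c in counts.items() if c >= threshold)

row = [0, 1] * 8  # alternating cells, a typically "ordered" pattern
print(chord_from_histogram(lookup_histogram(row)))
```

Recomputing the histogram each iteration is what turns a rule's statistical signature into a chord sequence rather than a single static chord.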


Listeners were required to classify each sequence as chaotic, ordered, or complex, using audio-only, visual, or audio-visual displays.

Musicians and non-musicians were tested, and although both groups found it most difficult to make correct classifications using the audio-only display, both classified correctly at rates significantly above chance. Musicians performed significantly better than non-musicians in the audio-only condition.

This experiment is written up in the context of sonification in Issues in Auditory Display.
For a discussion of the relevance of Artificial Life models to composers see chapter iv of my thesis.

Made at Creative Systems Lab and CCNR, Sussex University.

© ecila. 2010