This week we revisit a research project developed by Goldsmiths’ Mick Grierson in collaboration with Sound and Music, Whitefield Schools and Centre and the London Philharmonic Orchestra.
LumiSonic is an application that visualises sound in real time, allowing hearing-impaired individuals to interact with a specifically designed graphical representation of sound.
Inspired equally by the experimental film tradition and by neuroscientific research, LumiSonic aims to help those with hearing difficulties better understand sound.
The LumiSonic iPhone app, created by Goldsmiths Creative Computing and Strangeloop, is a “proof of concept” visualisation tool that generates a filtered visual display from live sound. Incoming audio is analysed in real time using a Fast Fourier Transform (FFT) and translated into a moving image. The aesthetic and conceptual approach was informed by research in both visual perception and experimental film: concentric ring formations are a common feature of visual hallucinations and of experimental animation, and research suggests this is due to the structure of the visual system. This method of representing sound may therefore be more effective than linear ‘graphic’ approaches.
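The actual LumiSonic implementation is not published here, but the core idea (FFT analysis of live audio mapped onto concentric rings) can be sketched simply. The following Python snippet is a minimal, hypothetical illustration: the frame size, band grouping, and normalisation are assumptions, not the app's real parameters.

```python
import numpy as np

def frame_to_rings(frame, num_rings=32):
    """Map one audio frame to per-ring intensities for a concentric-ring display.

    Hypothetical sketch: the real LumiSonic mapping is not published, so the
    binning and scaling here are illustrative assumptions.
    """
    # Window the frame to reduce spectral leakage, then take FFT magnitudes.
    windowed = frame * np.hanning(len(frame))
    magnitudes = np.abs(np.fft.rfft(windowed))

    # Group FFT bins into num_rings bands; low frequencies map to inner rings.
    bands = np.array_split(magnitudes, num_rings)
    ring_levels = np.array([band.mean() for band in bands])

    # Normalise to 0..1 so each ring's brightness can be drawn directly.
    peak = ring_levels.max()
    return ring_levels / peak if peak > 0 else ring_levels

# Example: a 1 kHz test tone lights up the ring covering that frequency band.
sample_rate = 44100
t = np.arange(1024) / sample_rate
tone = np.sin(2 * np.pi * 1000 * t)
print(frame_to_rings(tone).round(2))
```

In a live setting, each incoming audio buffer would be passed through such a function and the resulting levels drawn as rings radiating from the centre of the screen, so changes in pitch and loudness appear as changes in the pattern's brightness and spread.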