Category Archives: Videos

Throwback Thursday: iPhone gesture recognition

Back in 2010, Marco Klingmann (MSc Cognitive Computing) wrote about his iPhone gesture recognition project…

“The growing number of small sensors built into consumer electronic devices, such as mobile phones, allows experiments with alternative interaction methods in favour of more physical, intuitive and pervasive human-computer interaction.

“This research examines hand gestures as an alternative or supplementary input modality for mobile devices.

“The iPhone is chosen as the sensing and processing device. Based on its built-in accelerometer, hand movements are detected and classified into previously trained gestures. A software library for accelerometer-based gesture recognition and a demonstration iPhone application have been developed. The system allows the training and recognition of free-form hand gestures.

“Discrete hidden Markov models form the core part of the gesture recognition apparatus. Five test gestures have been defined and used to evaluate the performance of the application. The evaluation shows that with 10 training repetitions, an average recognition rate of over 90 percent can be achieved.”
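The core idea of classifying a gesture with discrete hidden Markov models can be sketched briefly: quantize the accelerometer stream into discrete symbols, train one HMM per gesture, and pick the model that assigns the sequence the highest likelihood. The following is a minimal illustration of that decision step only; the model names, state counts and probability values below are invented for the example, not taken from Klingmann's project.

```python
import math

# Toy gesture models: (start probs, transition matrix, emission matrix),
# two hidden states, observations from a 2-symbol alphabet. In a real
# system these parameters would come from training (e.g. Baum-Welch)
# on quantized accelerometer data; these numbers are illustrative.
MODELS = {
    "shake": ([0.5, 0.5],
              [[0.1, 0.9], [0.9, 0.1]],    # states tend to alternate
              [[0.9, 0.1], [0.1, 0.9]]),   # state i mostly emits symbol i
    "still": ([1.0, 0.0],
              [[0.95, 0.05], [0.05, 0.95]],
              [[0.9, 0.1], [0.1, 0.9]]),
}

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of a symbol sequence under one discrete HMM,
    using the forward algorithm with per-step scaling to avoid underflow."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    log_p = 0.0
    for t in range(1, len(obs)):
        scale = sum(alpha)
        log_p += math.log(scale)
        alpha = [a / scale for a in alpha]          # normalize
        alpha = [emit[s][obs[t]] * sum(alpha[r] * trans[r][s] for r in range(n))
                 for s in range(n)]
    return log_p + math.log(sum(alpha))

def classify(obs, models=MODELS):
    """Return the gesture whose HMM gives the sequence the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))
```

An alternating symbol sequence such as `[0, 1, 0, 1, 0, 1]` is classified as "shake", while a constant sequence lands on "still"; the evaluation in the project compared exactly this kind of per-model likelihood across its five trained gestures.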


Marco Klingmann is now an interaction designer and app developer working in Switzerland. Follow him on Twitter.

The Secrets of the Antikythera Mechanism

 

More than a hundred years ago, an extraordinary mechanism was found by sponge divers at the bottom of the sea near the island of Antikythera. It astonished the whole international community of experts on the ancient world. Was it an astrolabe? Was it an orrery or an astronomical clock? Or something else?

Research over the last half century has begun to reveal its secrets. The machine dates from around the end of the 2nd century B.C. and is the most sophisticated mechanism known from the ancient world. Nothing as complex is known for the next thousand years. The Antikythera Mechanism is now understood to be dedicated to astronomical phenomena and operates as a complex mechanical computer which tracks the cycles of the Solar System.


Renewed interest in the ‘Antikythera Mechanism’, aka the world’s ‘first computer’, has abounded in the world’s media following the recent return of archaeologists to the Antikythera dive site where it was originally found.

We are very lucky to have Prof Xenophon Moussas from the University of Athens give a talk and demonstration about the workings of the ‘Mechanism’ here at Goldsmiths on Tuesday.

EVENT INFO:

‘The Secrets of the Antikythera Mechanism’, Prof Xenophon Moussas, University of Athens

Tuesday, October 21st 2014
Ben Pimlott Lecture Theatre, Goldsmiths, University of London
Talk: 5:30 – 6:30pm
Exhibition open: 5:00 – 7:00pm
Drinks and exhibition viewing: 6:00 – 7:00pm


Andy Lomas and Patrick Tresset award winners @ The Lumen Prize!

Cellular Forms ~ Andy Lomas

Andy Lomas, Head of Computer Graphics at Framestore, is the winner of the Lumen Prize Gold for ‘Cellular Forms’. Andy regularly gives lectures and seminars at Goldsmiths, and his work will be included in the ‘Creative Machine’ exhibition opening on 6th November 2014.

Patrick Tresset, a visiting research fellow at Goldsmiths, also took third prize with his project ‘5 Robots Named Paul’.

Throwback Thursday: Sensory Response Systems

Back in 2009, Ryan Jordan (MFA in Computational Studio Arts) created Sensory Response Systems, an exploration into audio-visual performance using an array of sensors and controllers responsive to physical movements.

The project also explored the reshaping and replication of the body through the use of fabrics, textiles and technologies in order for the performer to fully embody and ‘become’ the instrument.

The overall aim was to bring a more direct and immediate relationship and control over the sound and images being generated, and to allow for full body expression and intimacy between performer and instrument (computer).


Ryan Jordan runs the noise research laboratory and live performance platform NOISE=NOISE.

“Embedded in wires and circuits, Ryan Jordan beams throbbing, ritualistic recreations of rave musik from some dystopic future place where all recording technology is long since gone and only folk memories of ‘dance music’ exist.”
Dr Adam Parkinson, EAVI

Throwback Thursday: Scan_Memories


We’re going all the way back to 2009 for this week’s Throwback Thursday look at past projects developed at Goldsmiths Computing. 

The Scan_Memories project investigated how new technology can create or participate in the reconstruction of memories, compared with existing ways of remembering the deceased and of being remembered by the bereaved.

Developed at Goldsmiths by Miguel Andres-Clavera and Inyong Cho, the project used radio-frequency identification (RFID), mobile and multimedia technologies to give people a gateway for maintaining an emotional relationship with the deceased.

Clavera and Cho said: “The project opens a heterogeneous and direct access to the memories materialized in physical spaces and in objects connected with the dead, presenting the dialectic between constructed formations based on presence and absence, and memory reconstruction through patterns technology mediated.”

Watch the 20-minute documentary.

 

Throwback Thursday: LumiSonic

This week we revisit a research project developed by Goldsmiths’ Mick Grierson in collaboration with Sound and Music, Whitefield Schools and Centre and the London Philharmonic Orchestra.

LumiSonic is a sound application that visualises sound in real-time in a way that allows hearing-impaired individuals to interact with a specifically designed, graphical representation of sound.

Inspired equally by the experimental film tradition and neuroscientific research, LumiSonic aims to help those with hearing difficulties gain a better understanding of sound.

The LumiSonic iPhone app, created by Goldsmiths Creative Computing and Strangeloop, is a “proof of concept” visualisation tool that generates a filtered visual display based on live sound. Sound is transformed in real time through a Fast Fourier Transform (FFT) and translated into a moving image. The aesthetic and conceptual approach was informed by research in both visual perception and experimental film. Concentric ring formations are a common feature in both visual hallucinations and experimental animation; research suggests this is due to the structure of the visual system. This method of representing sound may therefore be more effective than linear ‘graphic’ approaches.
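The FFT-to-rings pipeline described above can be sketched in a few lines: take a frame of audio, compute its spectrum, and map frequency bands to the brightness of concentric rings. This is only an illustration of the general idea, not LumiSonic's actual implementation; the function name, band split and ring count are assumptions made for the example.

```python
import numpy as np

def frame_to_rings(frame, n_rings=8):
    """Map one mono audio frame to per-ring brightness values in [0, 1].
    Low frequencies drive the inner rings, high frequencies the outer ones."""
    # Window the frame, then take the magnitude spectrum via a real FFT.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    # Split the spectrum into one frequency band per ring
    # (band 0 = lowest frequencies = innermost ring).
    bands = np.array_split(spectrum, n_rings)
    energy = np.array([band.mean() for band in bands])
    # Normalize so the brightest ring has intensity 1.
    peak = energy.max()
    return energy / peak if peak > 0 else energy

# Example: a 440 Hz tone at 44.1 kHz should light up the innermost ring.
t = np.arange(1024) / 44100.0
rings = frame_to_rings(np.sin(2 * np.pi * 440 * t))
```

A real-time version would run this per audio buffer and render each intensity as the brightness of one ring, with a perceptual (e.g. logarithmic) band spacing rather than the equal split used here.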

Throwback Thursday: iMC Rap Maker

Back in 2009, BSc Creative Computing student Eric Brotto released his final-year project on the iPhone App Store. The app uses real-time audio analysis and manipulation technology on a mobile phone to create a rap soundtrack from a user’s voice.

From the App Store page: “Now you can be a rapper! iMC Rap Maker [records] you talking and then transforms it into a rap song. Syncing your voice to the beat and even adding DJ scratches to your vocal means you get a Hip Hop hit produced by you. You can then share your creation with your friends via Facebook and SoundCloud. Download your iMC today!”


Eric Brotto is now mentor-in-residence at Startup Reykjavik Accelerator Programme, co-founder of Creative Bytes Worldwide, and content creator for DECODED.