Projects

I am a member of the Embodied Audio-Visual Interaction group.

My main research interests are in animated virtual characters, particularly expressive body language. This broad area covers many aspects, including animation, AI and the simulation of behaviour, motion capture, and the analysis of body movement. My most recent work has centred on data-driven methods for creating responsive virtual characters.

Funded Projects

Better Than Life

Better Than Life is a collaborative project with Coney and Showcaster, funded by the Digital R&D Fund for the Arts, to develop a set of tools that give audiences at home a new way to engage with live performance, offering engagement between the performance, an audience in the space and an audience online.

The project aims to develop a new performance with interactive potential at its heart. The piece, provisionally called Better Than Life, will be a theatrical experiment blending traditional ‘live’ magic, suggestion and story-telling with a technologically enhanced world around the audience.

PRAISE

http://www.iiia.csic.es/praise/

PRAISE is an EU FP7 funded project that aims to use community building software combined with content-based analysis to enhance social music learning. The technology platform will enable both automated and community-based feedback on musical performance. My part of the project will primarily be the analysis of posture during performance and of musical gestures.

Performance Driven Expressive Virtual Characters

Creating believable, expressive interactive characters is one of the great, and largely unsolved, technical challenges of interactive media. Human-like characters appear throughout interactive media, virtual worlds and games, and are vital to the social and narrative aspects of these media, but they rarely have the psychological depth of expression found in other media. This project develops a new approach to creating interactive characters, starting from the observation that the central problem with current methods is that creating a character’s interactive behaviour, or Artificial Intelligence (AI), is still primarily a programming task, and therefore in the hands of people with a technical rather than an artistic training. Our hypothesis is that an actor’s artistic understanding of human behaviour will bring an individuality, subtlety and nuance to the character that would be difficult to create in hand-authored models. This will help interactive media represent more nuanced social interaction, broadening their range of application.
The proposed research will use information from an actor’s performance to determine the parameters of a character’s behaviour software. We will use motion capture to record an actor interacting with another person. The recorded data will be used as input to a machine learning algorithm that will infer the parameters of a behavioural control model for the character. This model will then be used to control a real-time animated character in interaction with a person. The interaction will be a full-body interaction involving motion tracking of posture and/or gestures, and voice input.
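The project’s own methods are described in the publications below; purely as an illustration of this pipeline, here is a minimal Python sketch, assuming the motion capture session has already been annotated with discrete behaviour states for the actor and discrete observations of the interaction partner. All names and the simple counting-based estimator are illustrative assumptions, not the project’s actual algorithm.

import numpy as np

# Illustrative sketch: infer the parameters of a finite state machine
# behaviour controller from an annotated motion capture session. We assume
# the recording has already been segmented into discrete behaviour states
# for the actor (e.g. "idle", "nod", "gesture") and quantised observations
# of the interaction partner (e.g. "speaking", "pausing").

def learn_fsm(actor_states, partner_obs, n_states, n_obs):
    """Estimate P(next_state | current_state, partner_observation) by counting."""
    counts = np.ones((n_states, n_obs, n_states))  # add-one smoothing
    for t in range(len(actor_states) - 1):
        counts[actor_states[t], partner_obs[t], actor_states[t + 1]] += 1
    return counts / counts.sum(axis=2, keepdims=True)

def step(transitions, state, obs, rng):
    """At runtime, sample the character's next behaviour state."""
    return rng.choice(transitions.shape[0], p=transitions[state, obs])

# Toy usage: a short annotated session, 3 actor states, 2 partner observations.
rng = np.random.default_rng(0)
actor_states = [0, 0, 1, 2, 0, 1, 1, 2, 0]   # per-frame behaviour labels
partner_obs  = [0, 1, 1, 0, 0, 1, 1, 0, 0]   # per-frame partner labels
T = learn_fsm(actor_states, partner_obs, n_states=3, n_obs=2)
state = 0
for obs in [1, 1, 0]:
    state = step(T, state, obs, rng)  # would drive the real-time character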

This project is funded by the EPSRC under the first grant theme (project number EP/H02977X/1).

Gillies, Marco, 2009. Learning Finite State Machine Controllers from Motion Capture Data. IEEE Transactions on Computational Intelligence and AI in Games, 1 (1). pp. 63-72. ISSN 1943-068X

Gillies, Marco and Pan, Xueni and Slater, Mel and Shawe-Taylor, John, 2008. Responsive Listening Behavior. Computer Animation and Virtual Worlds, 19 (5). pp. 579-589. ISSN 1546-4261

Interactive and Virtual Performance


This project focuses on performance (drama, dance and performer-driven work), an art form that is not often considered in the design of digital technology. The work will be interactive, in the sense that the performer and/or audience interact with a digital display, and virtual in the sense that the performance is mapped onto a digital virtual avatar. We will focus on two aspects of this kind of work that have been enabled by recent technological advances, but which are still challenging for artists to implement: motion-based interaction and full-size stereoscopic (3D) virtual humans.

The ability to interact solely with natural body movements adds considerably to the sense of immersion and the ease of interaction. Movement-tracking technologies enable artists to create compelling new styles of interaction; this is particularly true for performers, who make extensive use of their own bodies in performance. However, the technology is still difficult to use. Low-cost motion capture systems (such as the one in the Goldsmiths Digital Studio) make this task easier, but recognizing particular movements relies on complex statistical pattern recognition algorithms. We are currently developing a performer-centred software platform for movement recognition. The first aim of the project will be to investigate how performers use this technology.
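As an illustration of the kind of statistical pattern recognition involved, here is a minimal Python sketch of movement classification, assuming fixed-length windows of pose features have already been extracted from the tracking data. The Gaussian-per-class model and all names are illustrative assumptions, not the platform’s actual algorithm.

import numpy as np

# Illustrative sketch: each labelled training example is a fixed-length
# window of pose features (joint angles, velocities, etc., flattened to a
# vector), and we fit one diagonal Gaussian per movement class.

def fit_classes(windows, labels):
    """Fit a diagonal Gaussian to the feature windows of each movement class."""
    models = {}
    for label in set(labels):
        X = np.array([w for w, l in zip(windows, labels) if l == label])
        models[label] = (X.mean(axis=0), X.var(axis=0) + 1e-6)
    return models

def recognise(window, models):
    """Return the movement class with the highest log-likelihood."""
    def loglik(mean, var):
        return -0.5 * np.sum(np.log(2 * np.pi * var) + (window - mean) ** 2 / var)
    return max(models, key=lambda l: loglik(*models[l]))

# Toy usage: 8-dimensional flattened feature windows for two gestures.
rng = np.random.default_rng(1)
wave  = [rng.normal(1.0, 0.2, 8) for _ in range(20)]
point = [rng.normal(-1.0, 0.2, 8) for _ in range(20)]
models = fit_classes(wave + point, ["wave"] * 20 + ["point"] * 20)
print(recognise(rng.normal(1.0, 0.2, 8), models))  # -> "wave"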

A second theme of the project will be movement-based interaction with a virtual performer (whose behaviour will be based on motion data captured from a real performer). In order for this interaction to be equivalent to interacting with a real performer, the virtual performer must appear to share the same space as the real participant, so that spatial movements can have the same referent.

Digital Reconstruction in Archaeology and Contemporary Performance

This AHRC/BT Research Network explores the use of digital technology for documenting sites and events in archaeology and contemporary art and performance. It investigates how these disciplines can influence each other’s use of technology and how new technologies can be applied.

The aim of the network is to generate new ideas for collaborative, interdisciplinary research around the research themes. The expected outcomes are a number of concrete grant proposals for which the participants will seek research funding.

The project will consist of interdisciplinary workshops including academics, technologists and working arts practitioners (who will act as end users).

Network website

Older Projects
