This video shows a new approach to designing video game characters that can respond to our body movements and body language. Rather than trying to program explicit rules for behavior, which would make it hard to capture the subtleties of body language, our software allows people to design movements directly by moving and interacting.
Two people can play the roles of the video game character and the player, showing how the character should respond by acting out the movements themselves. This allows them to design movements in a natural way, by moving, rather than having to think about mathematical rules. The motions of both participants are recorded and synchronized. This data is then used as input to a machine learning algorithm, which learns a controller that makes the video game character respond in the same way as the people who designed it.
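The video doesn't specify the learning algorithm, but the core idea — map the player's current pose to the character response that was recorded alongside the most similar player pose — can be sketched as a nearest-neighbour lookup over the synchronized recordings. All names and the pose representation (flat lists of joint coordinates) are hypothetical, for illustration only:

```python
import math

def nearest_neighbor_controller(training_pairs, player_pose):
    """Return the character pose recorded with the most similar player pose.

    training_pairs: list of (player_pose, character_pose) tuples, one per
    synchronized time step; each pose is a flat list of joint coordinates.
    (Hypothetical data layout, not the system's actual representation.)
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Pick the recorded pair whose player pose is closest to the live input,
    # and reply with the character pose captured at that same moment.
    _, response = min(training_pairs, key=lambda pair: dist(pair[0], player_pose))
    return response

# Toy data: two synchronized (player, character) pose pairs.
pairs = [
    ([0.0, 0.0], [1.0, 1.0]),  # player still      -> one recorded response
    ([5.0, 5.0], [2.0, 2.0]),  # player approaches -> a different response
]
print(nearest_neighbor_controller(pairs, [0.2, -0.1]))  # → [1.0, 1.0]
```

A real system would interpolate between neighbours or learn a smoother model to avoid jerky transitions, but the lookup captures how acted-out demonstrations become an automatic controller.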
This style of design is particularly well suited to actors and performers, who have a deep understanding of movement and body language. We conducted an in-depth case study with physical theatre performer Emanuele Nargi, who used our software to design an interactive character based on his interactions with members of the public.
Thank you to our colleagues at Goldsmiths, University of London, Emanuele Nargi of the MA Performance Making, the Embodied Audio-Visual Interaction research group at Goldsmiths Digital Studio, the Department of Computing and to Anna Furse of the Department of Theatre and Performance.
Filmed and edited by Cristina Picchi