On Jan 24th – 26th, Goldsmiths Department of Computing ran a Virtual Reality Hackathon weekend with HTC Vive. Participants were challenged to use VIVE hardware, including software development kits for eye, lip and hand tracking, to create an innovative virtual reality project. Meet some of the Goldsmiths student teams who wowed the judges with their unique and brilliant projects.
Group 1: Active Listening Training in VR
(aka the winners of the HTC Vive Hackathon 2020)
“Our team had very strong technical skills, with extensive knowledge in machine learning, VR and Unity development” – Carlos Gonzalez Diaz
What was the biggest challenge?
For this group the biggest challenge was the use of eye tracking, along with the additional challenge of combining experimental sensors. The team experimented with movement, mouth, eye, finger and EEG (brain electrical activity) trackers. Unfortunately, the EEG and finger trackers proved too difficult to integrate within the strict time restrictions, so they were dropped. The team successfully integrated movement, eye and mouth tracking into a machine learning model in the final prototype.
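One common way to feed several tracker streams into a single machine learning model is to fuse the per-frame readings from each sensor into one feature vector. The sketch below illustrates that idea in Python; the function name and the specific feature dimensions are illustrative assumptions, not the team's actual code.

```python
def fuse_features(movement, eye, mouth):
    """Concatenate per-frame sensor readings into one feature vector.

    This is a minimal sketch of sensor fusion: each tracker contributes
    a few numbers per frame, and the combined vector becomes the input
    to a machine-learning model (e.g. a gesture or state classifier).
    """
    return list(movement) + list(eye) + list(mouth)

# Hypothetical frame: 3-D head velocity, 2-D gaze point, 1-D mouth openness
frame = fuse_features([0.1, -0.2, 0.0], [0.45, 0.61], [0.8])
# frame is a single 6-value vector ready for training or prediction
```

A fused representation like this also makes it easy to drop a sensor (as the team did with EEG and finger tracking) by simply omitting its slice of the vector.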
What made the project unique?
The combination of technologies the team used, paired with an interesting storyline, made the project stand out. The team used InteractML, an interactive machine learning framework for Unity developed by Carlos and colleagues. The machine learning aspect eased the team's workload.
What did they learn?
Cristina Dobre said “I’ve learnt many things from taking part in this event but if I’d choose one, that would be integration-as-you-go. As the team members specialised in different areas and worked in parallel on various parts of the system, we managed to put everything together towards the very end of the event. This gave us only little time to test and fix integration bugs which made the final work very stressful (also given our sleep-deprived states). We managed to have a playable demo with most of the important parts working, but it would have been a much smoother process if the integration would have taken place throughout the development, even though each part might have been only partly finished.”
Team members (from left to right in tweet above):
Cristina Dobre, PhD Human Centered AI Characters in VR,
Lili Eva Bartha, experienced Designer and Scientist,
Claire Wu, PhD Neuroscience,
Carlos Gonzalez Diaz, PhD Machine Learning for expressive interactions
Group 2: VR Illusion
This team was a group of Goldsmiths students, many of whom had only started learning VR in October 2019.
What were the team’s strengths?
The skills in the team were varied. Hankun’s knowledge of Unity helped them to solve their biggest problem: using C# to set the relationship between the size and position of the object. Yaqi brought skills in 3D modelling, so could quickly create the models they needed. Chaojing is skilled in storytelling and drawing, so could set the story of the game and draw the assets they needed. Finally, Shuai Xu is experienced in user interface design and sourced the music for the project.
How did the project relate to studies at Goldsmiths?
Chaojing Li said “For the production of virtual reality games, the sense of the presence of the player is essential, because I think the most important meaning of Virtual Reality is to give people an immersive experience. We think that if there is no such sense of presence, then VR games are no different from games on ordinary platforms. During last semester, in the “3D Virtual Environments and Animation” class, our teacher Xueni Pan and Marco Gillies explained a lot of theory about Virtual Reality and some related psychological knowledge. This gave us a preliminary understanding of how to create a sense of presence in the virtual environment.”
What was the project?
The group focused on virtual reality object interaction and eye tracking technology, and how to combine the two. In their programme, cubes are thrown onto a mechanical belt like you would see in a factory, and the user must stack the cubes on top of each other. The challenge is that when the user looks directly at the area where the cubes are, their vision is blocked, so they must use their peripheral vision to complete the task.
What is unique about your project?
Nima Jamalian said “For our project we reversed the use of eye tracking technology. In the majority of applications that use eye tracking, the focus is on where the user’s eyes are looking; in our application we reversed it. The programme checks that the player is not looking, and only then can the user perform the task – so we sort of track where the player is not looking.”
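The inverted gaze check can be sketched with simple vector maths: test whether the task area falls inside a cone around the gaze ray, and permit interaction only when it does not. The Python below is an illustrative sketch of that logic, not the team's Unity code; the function names and the 20° cone threshold are assumptions.

```python
import math

def is_looking_at(gaze_dir, eye_pos, target_pos, cone_degrees=20.0):
    """Return True if target_pos lies within a cone around the gaze ray.

    gaze_dir is assumed to be a unit direction vector from the eye tracker;
    eye_pos and target_pos are 3-D points.
    """
    to_target = [t - e for t, e in zip(target_pos, eye_pos)]
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0.0:
        return True  # target coincides with the eye: treat as "looking"
    to_target = [c / norm for c in to_target]
    # Angle between gaze direction and direction to target, via dot product
    cos_angle = sum(g, ) if False else sum(g * t for g, t in zip(gaze_dir, to_target))
    cos_angle = max(-1.0, min(1.0, cos_angle))
    return math.degrees(math.acos(cos_angle)) <= cone_degrees

def can_interact(gaze_dir, eye_pos, target_pos):
    """Inverted eye tracking: allow the action only while NOT looking."""
    return not is_looking_at(gaze_dir, eye_pos, target_pos)
```

For example, with the gaze pointing straight ahead at the cube area, `can_interact` is false; glance away to the side and it becomes true, matching the team's "track where the player is not looking" design.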