{"id":1499,"date":"2015-02-17T16:21:39","date_gmt":"2015-02-17T16:21:39","guid":{"rendered":"http:\/\/www.doc.gold.ac.uk\/blog\/?p=1499"},"modified":"2015-02-17T16:21:39","modified_gmt":"2015-02-17T16:21:39","slug":"major-funding-for-next-generation-tech-that-adapts-to-human-expression","status":"publish","type":"post","link":"http:\/\/www.doc.gold.ac.uk\/blog\/?p=1499","title":{"rendered":"Major funding for next-generation tech that adapts to human expression"},"content":{"rendered":"<p><strong><img src=\"http:\/\/www.gold.ac.uk\/news\/homepage-news\/wearabletechLarge.jpg\" alt=\"\" \/><\/strong><\/p>\n<p><strong>Computer scientists at Goldsmiths, University of London have been awarded more than \u00a31.6m to lead an international team in accelerating the development of advanced gaming and music technology that adapts to human body language, expression and feelings.<\/strong><\/p>\n<p>The success of first-generation interfaces that capture body movement, such as the Nintendo Wii and Microsoft Kinect, has demonstrated a public appetite for technology that allows users to interact with creative multimedia systems in seamless ways.<\/p>\n<p>The Rapid Mix consortium will now use years of research to develop advanced gaming, music and e-health technology that overcomes user frustrations, meets next-generation expectations, and allows start-ups to compete with developments from\u00a0major corporations such as\u00a0Apple, Google and Intel.<\/p>\n<p>Rapid Mix will bring cutting-edge knowledge from three leading technology labs to a group of five creative industry SMEs, based in Spain, Portugal, France and the UK, who will use the research to develop prototype products.<\/p>\n<p>Newly developed Application Programming Interfaces (the tools that allow one piece of software to interact with another) and new hardware designs will also be made available to the Do-It-Yourself community through an open access platform.<\/p>\n<p>Rapid Mix is led by <span style=\"text-decoration: 
underline;\"><a href=\"http:\/\/www.gold.ac.uk\/computing\/people\/academic\/tanakaatau\/\">Professor Atau Tanaka<\/a><\/span> from the <a href=\"http:\/\/www.gold.ac.uk\/computing\/\">Department of Computing<\/a> at Goldsmiths, University of London, with <span style=\"text-decoration: underline;\"><a href=\"http:\/\/www.gold.ac.uk\/computing\/people\/academic\/fiebrinkrebecca\/\">Dr Rebecca Fiebrink<\/a><\/span> and\u00a0<span style=\"text-decoration: underline;\"><a href=\"http:\/\/www.gold.ac.uk\/computing\/staff\/m-grierson\/\">Dr Mick Grierson<\/a><\/span>.<\/p>\n<p>Professor Tanaka comments: \u201cHumans are highly expressive beings. We communicate verbally, but the body is also a major outlet for both conscious and unconscious expression. In this quest for expression we\u2019ve created art, music and technology.<\/p>\n<p><strong>\u201cTechnological advances have their greatest impact when they enable us to express ourselves, so it logically follows that new, disruptive innovations need interfaces that take advantage of our expressivity, rather than acting to restrict it.\u201d<\/strong><\/p>\n<p>\u201cMicrosoft has promised a Kinect 2 that detects heart rate to assess gamers\u2019 responses, but small European businesses struggle to compete with the corporations when it comes to getting amazing products from the lab into the public\u2019s hands. Our project aims to overcome this challenge and get new technology directly to users, where it will have true impact.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Computer scientists at Goldsmiths, University of London have been awarded more than \u00a31.6m to lead an international team in accelerating the development of advanced gaming and music technology that adapts to human body language, expression and feelings. 
The success of first generation interfaces that capture body movement, such as the Nintendo Wii and Microsoft Kinect, &hellip; <a href=\"http:\/\/www.doc.gold.ac.uk\/blog\/?p=1499\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">Major funding for next-generation tech that adapts to human expression<\/span> <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":9,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"spay_email":""},"categories":[118,128,110,106,107],"tags":[],"jetpack_featured_media_url":"","_links":{"self":[{"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts\/1499"}],"collection":[{"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=\/wp\/v2\/users\/9"}],"replies":[{"embeddable":true,"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1499"}],"version-history":[{"count":1,"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts\/1499\/revisions"}],"predecessor-version":[{"id":1500,"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts\/1499\/revisions\/1500"}],"wp:attachment":[{"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1499"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1499"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.doc.gold.ac.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1499"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}