Signing avatar animation pipeline - creating an interactive Auslan assistant
Voice-based AI assistants like Siri, Alexa, and Google Home are not accessible to Deaf people. A team of researchers from the University of Queensland is working in collaboration with the Queensland Deaf community to develop a personal assistive device that will recognise Auslan (Australian Sign Language) and generate signed responses via an avatar. This presentation details the process of designing a human-like signing avatar and developing an animation pipeline based on motion capture and game engine technology.
Maria Zelenskaya is a digital designer, animator, researcher, and teacher with a great passion for interactive and immersive technology. Maria specialises in motion capture and real-time character animation. As part of her doctoral research, Maria designs interactive virtual avatars and examines the benefits of collaborative community-based design.