HCC-GV: Small: Generating Animal Avatar Animation With Specific Identifiable Traits Based Upon Viewer Perception of Real Animals
Computing devices have transformed person-to-person communication, dramatically altering how we present ourselves to the world. As the use of computer character animation rapidly expands into 3D digital media in mobile devices, social networking, and massively multiplayer online games, the demand for distinctive personalized digital avatars is beyond the capacity of key-frame animation, beyond the range of motion capture, and beyond the expressive capability of rule-based animation. The PI's goal in this project is to define a new framework that transforms biological locomotion into a content form, classified by identity-rich features that can be synthesized for use in animating animals, whether real or imagined. The research focuses on how species, age, and weight are perceived by people viewing real animals, with the intention of making virtual animals, or humans representing themselves as animal avatars, more expressive. Project outcomes will include a technique for procedural generation of animation for novel digital creatures that is capable of movement patterns signifying a specific animal species, age as young or old, and weight as heavy or light. The PI argues that developing this new way of creating and managing expressive animation requires a better understanding of how we perceive motion itself. To this end, the PI and his team will borrow from biological motion studies to determine what level of detail of motion information is required for recognition. He will use eye tracking to determine where the information is found in animal motion, and will employ linear analysis to decompose the motion and re-inject the identity-laden elements into novel animal forms. Thus, the work will combine three areas of research - perception of biological motion, eye tracking, and synthesis of gait patterns - to develop the foundation for a generative system for animal avatar animation.
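The abstract names "linear analysis" without committing to a specific method; principal component analysis (PCA) is one common linear technique for this kind of decomposition, and the sketch below uses it purely for illustration. Every name here (the joint-angle matrix layout, the `decompose`/`reinject` helpers, the synthetic gait) is a hypothetical construction, not the project's actual pipeline: a motion clip is treated as a frames-by-joint-angles matrix, the identity-laden structure is assumed to live in the leading components, and those components are re-applied around a new character's rest pose.

```python
import numpy as np

def decompose(motion, n_components):
    """Split a (frames x joints) motion matrix into a mean pose,
    a linear basis, and per-frame weights via SVD-based PCA."""
    mean = motion.mean(axis=0)
    centered = motion - mean
    # Rows of vt are the principal directions of the centered motion.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]          # (k x joints)
    weights = centered @ basis.T       # (frames x k)
    return mean, basis, weights

def reinject(target_mean, source_basis, source_weights):
    """Rebuild motion around a new character's mean pose using the
    source animal's identity-laden components."""
    return target_mean + source_weights @ source_basis

# Toy example: a synthetic periodic "gait" with three joint angles.
t = np.linspace(0, 2 * np.pi, 60)
source = np.stack([np.sin(t), np.cos(t), 0.5 * np.sin(2 * t)], axis=1)
target_mean = np.array([1.0, -1.0, 0.5])   # hypothetical rest pose

mean, basis, weights = decompose(source, n_components=2)
hybrid = reinject(target_mean, basis, weights)
print(hybrid.shape)   # one reconstructed frame per source frame
```

Keeping only the leading components acts as the filter: low-variance detail is discarded while the dominant, presumably identity-carrying, structure of the gait is transferred to the new form.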
Broader Impacts: This project combines the use of computational systems to both analyze how we perceive motion and to synthesize new, novel motion. The research will contribute to our understanding of human perception of biological motion, and expand the performance range and emotional impact of synthesized animation. Techniques for isolating the spatio-temporal information that leads to recognition of identifiable traits will expand beyond the perception of humans and into the domain of animal motion, thereby opening the door to creating animal avatars that are expressive through motion in a variety of ways that communicate personality. Project outcomes will thus be of interest to creators of digital content for games, immersive virtual worlds, and cinema. The PI expects this work will lead to expansion of both the range of traits that can be procedurally applied to animation and the sophistication of novel gait generation methods.