This story is from the category Embodiment
Date posted: 03/08/2012
Computer-generated characters have become so lifelike in appearance and movement that the line separating them from reality is almost imperceptible at times. "The Matrix" sequels messed with audiences' perceptions of reality (in more ways than one) with action scenes mixing CG characters and real actors. Almost a decade later, superheroes and alien warriors dominate the multiplex. But while bipeds and quadrupeds have reigned supreme in CG animation, attempts to create and control their skeleton-free cousins using similar techniques have proved time-consuming and laborious.
Georgia Tech researchers have found a possible solution to this challenge by developing a way to simulate and control movement of computer-generated characters without a skeletal structure, anything from starfish and earthworms to an elephant's trunk or the human tongue.
Their modeling techniques have the potential to allow amateur animators and even young children unparalleled control of digital creatures by simply pointing and clicking on a screen to have them move the way they want. One can imagine aspiring animators with tools to build a more boisterous Bob -- "Monsters vs. Aliens'" resident blob -- or an updated Ursula from "The Little Mermaid," with her sinister tentacles used to full effect with computer graphics.
The researchers' work targets simulation and control of soft body locomotion -- movement of characters without a skeletal structure -- something that is rarely explored in animation, according to Karen Liu, one of the researchers and associate professor in the School of Interactive Computing at Georgia Tech.
Eschewing the traditional use of skeletons with moving joints as the basis for animation control, the Georgia Tech research simulates soft body computer models and controls their movement in completely new ways. Liu and fellow researchers Jie Tan and Greg Turk will present their research paper "Soft Body Locomotion" at SIGGRAPH 2012, the ACM international conference on computer graphics and interactive techniques, in Los Angeles, Aug. 5-9.
The computer models used in the research -- jello-like alphabet letters -- mimicked nature's soft body organisms and were created using "muscle fibers to control a volume-preserving finite element mesh." In short, just as a hacky sack or bean bag keeps its volume no matter how it is squashed, the computer models followed the same principle.
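The volume-preservation idea can be illustrated with a toy calculation. The sketch below is a hypothetical 2D stand-in (a single triangular element with a quadratic penalty), not the researchers' actual finite element formulation: a deformation that squashes an element without changing its area costs nothing, while one that changes the area is penalized.

```python
# Hypothetical sketch of volume preservation in a soft body element.
# In 2D, "volume" is area; a real simulator would use 3D elements and
# a proper constitutive model, which this deliberately omits.

def tri_area(a, b, c):
    """Signed area of the triangle with 2D vertices a, b, c."""
    return 0.5 * ((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1]))

def volume_penalty(rest, deformed, stiffness=1.0):
    """Quadratic energy penalizing any change in element area."""
    return 0.5 * stiffness * (tri_area(*deformed) - tri_area(*rest)) ** 2

rest      = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]   # rest area 0.5
squashed  = [(0.0, 0.0), (2.0, 0.0), (0.0, 0.5)]   # flattened, area still 0.5
stretched = [(0.0, 0.0), (2.0, 0.0), (0.0, 1.0)]   # area grown to 1.0

print(volume_penalty(rest, squashed))   # 0.0   -- squashing that keeps the area is free
print(volume_penalty(rest, stretched))  # 0.125 -- changing the area costs energy
```

The penalty drives the simulator to deform elements by redistributing material rather than creating or destroying it, which is what gives soft bodies their bean-bag feel.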
The soft body ABCs were able to perform a wide array of motions that users specified with simple point-and-click commands. The researchers developed algorithms that accept "high-level goals" -- specific movements such as walking from one point to another, or jumping and then regaining balance. Before this technique, getting a soft-body character to perform a meaningful movement could take an animator thousands of computer simulation trials to bring the soft body even close to a functional motion, Liu says.
"In this project we 'physically simulated' or created lifelike movements in the soft body models that don't require much user intervention. We've built a framework where the user or the animator can just click on a point of the soft body and direct the type of movement he or she wants."
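The framework Liu describes can be caricatured in a few lines: the user supplies only a high-level goal (a clicked target), and a controller translates it into low-level muscle activations. Everything below is an illustrative assumption, not the paper's actual controller; the class, function, and muscle names are invented for the sketch.

```python
# Hypothetical point-and-click controller: map a clicked target position
# into activations for opposing muscle groups. A real controller would
# plan through the full physics simulation; this one-axis toy only shows
# the high-level-goal -> low-level-actuation idea.

from dataclasses import dataclass

@dataclass
class SoftBody:
    x: float        # center-of-mass position along one axis
    muscles: dict   # activation per muscle group, each in [0, 1]

def click_goal(body, target_x, gain=0.5):
    """Contract the muscle group facing the clicked goal; relax the other."""
    error = target_x - body.x
    act = max(0.0, min(1.0, gain * abs(error)))  # clamp activation to [0, 1]
    if error > 0:
        body.muscles = {"left": 0.0, "right": act}
    else:
        body.muscles = {"right": 0.0, "left": act}
    return body.muscles

body = SoftBody(x=0.0, muscles={"left": 0.0, "right": 0.0})
print(click_goal(body, target_x=1.0))  # {'left': 0.0, 'right': 0.5}
```

The design point is the interface, not the physics: the animator never touches individual mesh vertices or fibers, only the goal.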
See the full story via external site: www.sciencedaily.com
Most recent stories in this category (Embodiment):
28/02/2017: UK robotics research gets £17.3m pledge