After six months of computation, researchers at Carnegie Mellon University and the University of California, Berkeley, say they have simulated almost every way in which a piece of cloth might shift, fold and drape over a moving human figure.
'I believe our approach generates the most beautiful and realistic cloth of any real-time technique,' said Adrien Treuille, associate professor of computer science and robotics at Carnegie Mellon.
To create this cloth database, the team took advantage of cloud computing power, ultimately using 4,554 central processing unit hours to generate 33 gigabytes of data. Treuille said this presents a new paradigm for computer graphics, in which it will be possible to provide real-time simulation for virtually any complex phenomenon, whether it’s a naturally flowing robe or a team of galloping horses.
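The precompute-then-look-up paradigm can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual data structure: the state/control encoding, the table layout, and the stand-in simulation function are all assumptions made for clarity.

```python
import random

def simulate_cloth(state, control):
    # Stand-in for an expensive offline cloth simulation: returns a
    # "configuration" as a list of 100 3-D vertex positions.
    # (Hypothetical; the real system's state representation is far richer.)
    rng = random.Random(state * 10 + control)
    return [(rng.random(), rng.random(), rng.random()) for _ in range(100)]

# Offline phase: enumerate every (pose state, motion input) pair once and
# store the result. The real system spent thousands of CPU hours here;
# this toy version finishes instantly.
table = {
    (state, control): simulate_cloth(state, control)
    for state in range(4)      # coarse pose states of the figure
    for control in range(3)    # possible motion inputs
}

# Online phase: rendering a frame is an O(1) lookup instead of a
# simulation step, which is what makes real-time playback possible.
frame = table[(2, 1)]
print(len(frame))  # 100 vertices, retrieved in constant time
```

The design trade-off is the one Treuille names: enormous offline cost and storage buy near-zero runtime cost per frame.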
To explore this highly complex space of motions, the researchers developed an iterative technique that repeatedly samples the cloth's behavior, automatically detecting regions where data are sparse or where errors occur.
For instance, in the study simulations, a human figure wore the cloth as a hooded robe; after some gyrations that caused the hood to fall down, the animation would show the hood popping back onto the figure's head for no apparent reason. The team's algorithm automatically identified the error and explored the dynamics of the system until the error was eliminated.
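The error-detection loop described above can be sketched as a scan over sampled transitions that flags any frame-to-frame "pop" for further simulation. The displacement metric and threshold below are illustrative assumptions, not the paper's exact criterion.

```python
import math

def detect_popping(frames, threshold=0.5):
    """Return indices of transitions whose mean per-vertex displacement
    exceeds the threshold -- e.g. a hood snapping back for no reason."""
    flagged = []
    for i in range(len(frames) - 1):
        # Euclidean distance moved by each cloth vertex between frames.
        jumps = [
            math.dist(a, b)
            for a, b in zip(frames[i], frames[i + 1])
        ]
        if sum(jumps) / len(jumps) > threshold:
            flagged.append(i)
    return flagged

# Smoothly drifting cloth (100 vertices) with one artificial discontinuity,
# mimicking the hood artifact: frame 6 jumps far from its neighbors.
frames = [[(0.01 * t, 0.0, 0.0)] * 100 for t in range(10)]
frames[6] = [(3.0, 0.0, 0.0)] * 100  # the "popping" artifact

print(detect_popping(frames))  # [5, 6]: the jumps into and out of frame 6
```

In the iterative scheme the article describes, transitions flagged this way would be re-simulated and resampled until no discontinuities remain.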
'The criticism of data-driven techniques has always been that you can’t pre-compute everything,' Treuille added. 'Well, that may have been true 10 years ago, but that’s not the way the world is anymore.'