Prestigious academic journal Nature reports that applying a small current from a 9V battery to your scalp while videogaming can double the rate at which you learn to play the game…
“Volunteers receiving 2 milliamps to the scalp (about one-five-hundredth the amount drawn by a 100-watt light bulb) showed twice as much improvement in the game after a short amount of training as those receiving one-twentieth the amount of current. ‘They learn more quickly but they don’t have a good intuitive or introspective sense about why,’ says Clark.”
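For the curious, the “one-five-hundredth” comparison is roughly right. Here's a quick back-of-envelope check in Python — note the article doesn't state a supply voltage, so the 120 V figure is my assumption:

```python
# Sanity-check "2 mA is about one-five-hundredth of a 100 W bulb's current".
# Assumes a 120 V mains supply (not stated in the article).
bulb_current = 100 / 120   # I = P / V, roughly 0.83 A
scalp_current = 0.002      # 2 mA
ratio = bulb_current / scalp_current
print(round(ratio))        # about 417 -- same ballpark as "one-five-hundredth"
```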
Nice. Will iClone 5 ship with a small “thinking cap” and a couple of spare batteries?
Variety covers a chat between James Cameron, George Lucas and Jeffrey Katzenberg…
As for animation, a sea change is coming, [Jeffrey] Katzenberg [CEO of Dreamworks] said, in the form of a process that will “fundamentally change the quality of what we do.” It’s called scalable multicore processing, essentially another exponent of computing power that will take the tedium of rendering out of the animation process.
“The power and the speed of the chips is about to take a quantum leap, the result of which is that our artists will be able to see their work in real time,” Katzenberg said.
As of now, animators make a couple of seconds of rough, low-resolution footage that’s sent to a rendering farm and returned as much as 12 hours later.
Not for much longer.
“It’s almost as if they were painting blind,” Katzenberg said. “What this next generation does is that the artist will see their work as they’re creating it. … I cannot tell you how transformative that will be in our storytelling.”
Have a Microsoft Kinect and a hankering to create iClone motion-capture files with it? iPiSoft currently has a 30-day trial demo of their iPi Studio, with functional Kinect -> iClone export. An iPiSoft tutorial for iClone export is available here. It seems fairly straightforward…
For those without a Kinect, there doesn’t yet appear to be an online archive of free Kinect mo-cap sessions that you could download and use with iPiSoft. In fact, I’m not sure it’s even possible to archive the raw data capture like that.
Talking of character animation, there’s a free script to automate breathing of a Poser character. It works by adjusting the dials on body parts, so it may then be possible to port the animation loop to iClone via an .fbx export of the character?
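To give a rough idea of how such a dial-driven breathing loop might work — this is only a sketch, not the actual script, and the dial and actor names (“Chest”, “Scale”) are illustrative guesses — the core is just a smooth sinusoidal inhale/exhale curve sampled once per frame:

```python
import math

def breathing_dial_values(n_frames, fps=30, breaths_per_min=14, amplitude=0.5):
    """One dial value per frame: a smooth inhale/exhale cycle.

    amplitude is the peak dial setting; breaths_per_min sets the loop speed.
    The (1 - cos) form starts and ends each breath at zero, so the loop tiles.
    """
    freq = breaths_per_min / 60.0  # breath cycles per second
    return [amplitude * 0.5 * (1 - math.cos(2 * math.pi * freq * f / fps))
            for f in range(n_frames)]

# Inside Poser's Python console, values like these could be keyed onto a
# figure roughly as follows (Poser API calls from memory -- check the
# Poser Python docs before relying on them):
#   scene = poser.Scene()
#   chest = scene.CurrentFigure().Actor("Chest")
#   parm = chest.Parameter("Scale")   # or a chest-expansion morph dial
#   for frame, v in enumerate(breathing_dial_values(90)):
#       scene.SetFrame(frame)
#       parm.SetValue(v)
```

Since the result is plain keyframes on body-part dials, it should survive a generic .fbx export the same way any hand-keyed animation would — which is why the iClone round trip seems plausible.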