V Motion project



Damn, that's polished. I wonder how many people were involved in creating it. And with a first-generation Kinect it doesn't seem to have any latency…

The depth sensors are getting amazing. The new Kinect 2 tracks six people at a time, and much more quickly than the older version and the PrimeSense sensors.

I've been working on converting the depth data from the new Kinect 2 into point clouds to make cool visualizations for the new Oculus Rift DK2, rather than just controlling a character.

Kinectv2Concepts - YouTube: www.youtube.com/watch?v=LZgnVc4CqRg
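
For anyone curious what "converting depth data into point clouds" actually involves, here is a minimal sketch of the usual pinhole back-projection step, assuming a 512x424 Kinect v2 depth frame. The intrinsics (FX, FY, CX, CY) are rough placeholder values, not the sensor's real calibration, and the fake frame at the bottom just stands in for whatever the Kinect SDK would hand you.

[code]
import numpy as np

# Rough Kinect v2 depth-camera intrinsics (512x424 frame). These are
# placeholders; real values should come from the sensor's calibration.
FX, FY = 365.0, 365.0      # focal lengths in pixels (assumed)
CX, CY = 256.0, 212.0      # principal point (assumed)

def depth_to_point_cloud(depth_mm):
    """Back-project a 424x512 depth frame (millimetres) into an Nx3
    array of 3D points in metres, dropping invalid (zero) pixels."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0          # mm -> metres
    x = (u - CX) * z / FX                             # pinhole model
    y = (v - CY) * z / FY
    points = np.stack((x, y, z), axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                   # keep valid depth only

if __name__ == "__main__":
    # Fake frame for illustration; a real one would come from the Kinect SDK.
    fake_depth = np.random.randint(500, 4500, size=(424, 512), dtype=np.uint16)
    cloud = depth_to_point_cloud(fake_depth)
    print(cloud.shape)
[/code]

Once you have the Nx3 array, rendering it as a particle cloud in the Rift is just a matter of pushing the points into whatever engine you're using each frame.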
 
It's hard to tell. Is this a choreographed thing where he practiced every single move you see? Or is he taking a more freestyle approach? Or does he have no input at all, and this is simply a program running while he is there for show, so that if he messed up the program would keep going, sort of like lip-syncing?

It would appear that they're just interfacing Ableton Live with their own front end: they produced a tune that can be roughly "jammed along to" and used the Kinects as a controller (with some software of their own).
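
If that's how it's wired up, the Kinect-to-Live link can be as simple as tracked joint positions mapped onto MIDI CC messages that Live then MIDI-maps to macros or effect parameters. A minimal sketch below, assuming the mido library and a virtual MIDI port; the port name, CC number, and the idea that hand height arrives as a normalised 0.0-1.0 value are all placeholder assumptions, not how the V Motion team actually did it.

[code]
import mido

# Placeholder virtual MIDI port name; list the real ones with
# mido.get_output_names(). Ableton Live just needs to see the same port.
PORT_NAME = "LoopBe Internal MIDI"
CC_NUMBER = 1   # mod wheel, easy to MIDI-map to a macro in Live

def send_hand_height(port, hand_height):
    """Convert a normalised hand height (0.0-1.0) into a 0-127 CC message."""
    value = max(0, min(127, int(hand_height * 127)))
    port.send(mido.Message('control_change', control=CC_NUMBER, value=value))

if __name__ == "__main__":
    with mido.open_output(PORT_NAME) as port:
        # In a real setup this loop would be driven by the Kinect tracking data.
        for height in (0.0, 0.25, 0.5, 0.75, 1.0):
            send_hand_height(port, height)
[/code]

With the backing track pre-produced and quantised in Live, the performer only has to land gestures roughly on the beat, which would explain why it looks so tight with so little apparent latency.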