Finally had time to play around with the new Kinect. It is much better than the original Kinect in many ways. It looks like integration with Unity is not going to be trivial, though, as K4W is 64-bit whereas Unity runs as a 32-bit process… Oh well..
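One common workaround for this kind of 64-bit/32-bit mismatch is to run the K4W code in its own 64-bit process and ship the tracking data to Unity over a local socket. A minimal sketch of such a bridge, in Python for illustration (the address, port and data format are my assumptions, not part of either SDK):

```python
import json
import socket

def send_joints(sock, addr, joints):
    """Serialise a {joint_name: [x, y, z]} dict and send it as one UDP packet."""
    sock.sendto(json.dumps(joints).encode("utf-8"), addr)

def receive_joints(sock):
    """Receive one packet and decode it back into a dict."""
    data, _ = sock.recvfrom(65536)
    return json.loads(data.decode("utf-8"))

if __name__ == "__main__":
    addr = ("127.0.0.1", 9000)   # assumed local port, not a K4W default
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(addr)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # In the real setup, the 64-bit K4W process would send live skeleton data.
    send_joints(tx, addr, {"head": [0.0, 1.7, 2.0]})
    print(receive_joints(rx))    # {'head': [0.0, 1.7, 2.0]}
```

UDP is a natural fit here: per-frame tracking data is small, and a dropped packet is simply replaced by the next frame.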
News for 2013
Testing the Kinect 2 / K4W
Diving a bit deeper into PixelConduit, a real-time compositing tool that is proving invaluable to the project. It handles capture from Blackmagic capture devices, chroma keying of the incoming video, and streaming of the result into a Unity texture using Syphon.
This week I was able to make a much nicer-looking key than before, with more controls. I am looking into controlling these settings via a MIDI controller with motorised faders.
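PixelConduit handles the MIDI binding itself, but the underlying idea is just a linear mapping from 7-bit controller values onto a keyer parameter's range. A minimal sketch of that mapping (function and parameter names are mine, not PixelConduit's):

```python
def cc_to_param(cc_value, lo, hi):
    """Map a 7-bit MIDI CC value (0-127) linearly onto the range [lo, hi]."""
    cc_value = max(0, min(127, cc_value))   # clamp to the valid CC range
    return lo + (hi - lo) * cc_value / 127.0

if __name__ == "__main__":
    # A fader at its midpoint driving a key tolerance between 0 and 100:
    print(cc_to_param(64, 0.0, 100.0))
```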
A quick and dirty demonstration of the state of things so far:
A Unity3D camera is controlled by Previzion, respecting position, orientation, zoom and focus. The live camera feed is chroma keyed using PixelConduit and streamed via Syphon to Unity3D, where it is composited with the virtual environment.
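One detail worth noting about respecting zoom: if the tracker reports zoom as a lens focal length (an assumption on my part), it has to be converted to the field of view a virtual camera expects. A minimal sketch of that conversion; the 24 mm sensor height below is an illustrative value, not something Previzion reports:

```python
import math

def fov_from_focal_length(focal_mm, sensor_mm):
    """Field of view in degrees along the sensor dimension given in sensor_mm."""
    return math.degrees(2.0 * math.atan(sensor_mm / (2.0 * focal_mm)))

if __name__ == "__main__":
    # A 12 mm lens on a hypothetical 24 mm-tall sensor gives a 90° vertical FOV:
    print(fov_from_focal_length(12.0, 24.0))
```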
Unity 4.3 released today
We are now the proud owners of a 12-camera motion capture setup from OptiTrack that we will use to control virtual characters in real time.
Winners Announced for the 65th Primetime Emmy Engineering Awards
Enjoyed a 3-day guest lecture at INSEEC London, where students got a crash course in Unity, designed their own level and, on the last day, got to explore it on the Oculus Rift.
Progress: realtime chroma keying of video in PixelConduit, streaming to Unity via Syphon
Implemented & tested successfully:
- streaming external video into the 3D environment

Implemented, not tested:

Still to implement:
- chroma keying of incoming camera footage
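For the chroma keying still on the list: the core of most keyers is a per-pixel distance from the key colour in chroma space, turned into an alpha value with a tolerance and a soft edge. A minimal sketch of that idea (parameter names and default values are illustrative, not PixelConduit's):

```python
import math

def rgb_to_cbcr(r, g, b):
    """BT.601 chroma components (Cb, Cr) of an 8-bit RGB pixel."""
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return cb, cr

def key_alpha(pixel, key=(0, 255, 0), tolerance=60.0, softness=40.0):
    """Alpha for one pixel: 0 = keyed out, 1 = opaque, soft edge in between."""
    cb, cr = rgb_to_cbcr(*pixel)
    kcb, kcr = rgb_to_cbcr(*key)
    dist = math.hypot(cb - kcb, cr - kcr)
    return min(1.0, max(0.0, (dist - tolerance) / softness))

if __name__ == "__main__":
    print(key_alpha((0, 255, 0)))      # 0.0  (pure green: fully keyed out)
    print(key_alpha((255, 255, 255)))  # 1.0  (white: fully opaque)
```

Working in chroma space rather than raw RGB makes the key largely insensitive to brightness, which is why shadows on a green screen can still key out cleanly.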
The story so far
For the past five years, the Netherlands Film Academy has used the real-time game engine Unity3D for all interactive courses in the Interactive Media & Visual Effects department. Unity has a relatively gentle learning curve and allows our students to create visually outstanding interactive experiences.
A couple of years ago our department acquired a Lightcraft Previzion virtual camera system, which can show real actors in a virtual environment, in camera and in real time. This technique, called Virtual Production, has created a revolution in the film industry.
Previzion uses proprietary software to pull this off, and this has some disadvantages. First of all, we have found the workflow for importing assets from Maya suboptimal; more importantly, the closed nature of the software does not allow us to add any functionality.
Luckily, Previzion has a protocol for sending its tracking data to third-party applications, so I started researching whether we could make a connection with Unity. In early 2013 we successfully tested an early prototype in which most of the information from Previzion was visualised in Unity in real time.
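On the receiving side, such a prototype essentially boils down to reading camera records from the tracking stream and applying them to the virtual camera every frame. The actual Previzion wire format is not documented here, so the sketch below assumes a hypothetical comma-separated record purely for illustration:

```python
from collections import namedtuple

CameraState = namedtuple("CameraState", "x y z pan tilt roll zoom focus")

def parse_camera_record(line):
    """Parse one hypothetical 'x,y,z,pan,tilt,roll,zoom,focus' record."""
    fields = [float(f) for f in line.strip().split(",")]
    if len(fields) != 8:
        raise ValueError("expected 8 fields, got %d" % len(fields))
    return CameraState(*fields)

if __name__ == "__main__":
    state = parse_camera_record("1.0,1.6,-3.2,45.0,-5.0,0.0,35.0,2.5")
    print(state.pan, state.zoom)   # 45.0 35.0
```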
After this successful test we realised that the combination of Previzion hardware and Unity software has more potential uses than just previsualisation. Combined with 3D sensors such as the Kinect, PrimeSense and Leap Motion, it should be possible for an actor to actually interact with a virtual environment.
This blog will document our research into realtime technology in cinema production, and provide links to new developments relevant to that research.