http://web.mit.edu/newsoffice/2011/video-holography-0124.html
http://www.youtube.com/watch?v=4LW8wgmfpTE&feature=player_embedded
True 3D real-time holography is not only possible, it can be driven by a Kinect as its input device. A team at MIT has recreated the famous 3D Princess Leia scene from the original Star Wars, but as a live video feed!
Michael Bove's group at the MIT Media Lab has managed to create real 3D holograms in real time, transmitting a reenactment of the Princess Leia scene from the original Star Wars. What is important to notice is that, while the resolution and frame rate may be low, this is live 3D, not stored (as the hologram projected by R2-D2 was in the movie).
It's a great stunt, but don't miss its importance: this is real-time 3D holography, which means you can view it without glasses or any other gadgets, and you can move around and see behind objects in the scene. This is more than the flat stereoscopic 3D you get in movies.
The idea being developed here is very simple to understand but very difficult to implement. A hologram creates a true 3D image (no glasses or any other trickery needed) by recreating the wavefront of light that the real 3D scene would have created.
When you view a hologram you really are interacting with the light field that the original object would have created. The big problem is that creating the wavefront involves taking a parallel beam of light and passing it through an interference pattern that transforms it into the desired wavefront. The standard way of doing this is to record an interference pattern onto a photographic plate. The interference pattern is obtained from the original 3D scene, so what you have is essentially a 3D camera.
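To make the idea concrete, here is a minimal sketch in Python with NumPy of the interference pattern that a single point of a scene produces on such a plate: the point's spherical wave is superposed with a flat reference beam, and the recorded intensity is the familiar pattern of concentric rings. The wavelength, distances and grid size are illustrative assumptions, not the parameters of the MIT system.

```python
import numpy as np

wavelength = 633e-9             # red laser light in metres (assumed)
k = 2 * np.pi / wavelength      # wavenumber

# A small "plate": a 2 mm square grid of sample points.
n = 1024
coords = np.linspace(-1e-3, 1e-3, n)
x, y = np.meshgrid(coords, coords)

# One point of the scene, 10 cm behind the plate.
px, py, pz = 0.0, 0.0, 0.1

# Spherical object wave arriving at each sample point of the plate.
r = np.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2)
object_wave = np.exp(1j * k * r) / r

# Plane reference wave hitting the plate head on.
reference_wave = np.ones_like(object_wave)

# The plate records the intensity of the combined field:
# a pattern of concentric rings (a Fresnel zone pattern).
fringes = np.abs(object_wave + reference_wave) ** 2
```

Illuminating a plate carrying this pattern with the reference beam alone reproduces the point's spherical wave; a real scene is, in effect, a great many such points recorded at once.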
Unfortunately you can't easily turn this into a broadcast system because the resolution needed to record the interference patterns is too great, as is the exposure time.
A better solution would be to create the interference patterns by computing them and then displaying them on a screen. The problem is that you have to invent a whole new type of screen with a very high resolution and the ability to change the "phase" of the light at each point of the screen. The display being used at MIT was developed by students of Stephen Benton, a pioneer of holographic imaging who died in 2003.
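MIT's display achieves its phase control with specialized hardware, but as a rough illustration of what "changing the phase of the light at each point" means, a computed complex field can be reduced to quantised per-pixel phase values for a generic phase-modulating device. This is a sketch of that generic step, not a description of the MIT display:

```python
import numpy as np

def to_phase_levels(field: np.ndarray, levels: int = 256) -> np.ndarray:
    """Reduce a computed complex field to quantised per-pixel phase values.

    Only the argument (phase) of the field is kept and mapped onto the
    device's discrete levels; 256 levels (8 bits) is an assumption.
    """
    phase = np.angle(field)                     # values in -pi .. pi
    normalised = (phase + np.pi) / (2 * np.pi)  # values in 0 .. 1
    return np.round(normalised * (levels - 1)).astype(np.uint8)
```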
In many ways this display is the most important part of the system, and the team are working on something better, simpler and hopefully cheaper.
Where does the Kinect fit in?
The Kinect simply acts as a cheap, off-the-shelf 3D camera. It works out the 3D location of each pixel, and using this position and color information the computers can work out the hologram in real time. The Kinect data is fed to a laptop, which sends it to a PC with three GPU-based graphics cards; these compute the interference patterns needed to create the wavefront. At the moment the computation only manages 15 frames per second, but with more work the team expects to reach standard frame rates.
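To see why three GPUs are needed for even 15 frames per second, consider the brute-force structure of the computation: every scene point the Kinect reports contributes its own fringe pattern to every pixel of the hologram, so the cost grows as scene points times hologram pixels. The following sketch mimics that pipeline in Python with NumPy; the frame sizes, scaling factors and sub-sampling are illustrative assumptions, not details of the MIT implementation.

```python
import numpy as np

wavelength = 633e-9             # illustrative laser wavelength in metres
k = 2 * np.pi / wavelength

def hologram_from_depth(depth, intensity, plate_size=2e-3, n=256):
    """Sum one spherical-wave contribution per (sub-sampled) scene point."""
    coords = np.linspace(-plate_size / 2, plate_size / 2, n)
    hx, hy = np.meshgrid(coords, coords)
    field = np.zeros((n, n), dtype=np.complex128)

    h, w = depth.shape
    # Sub-sample the depth map heavily; the full cost is
    # O(scene points * hologram pixels), which is what the GPUs absorb.
    # (A real system must also sample the plate finely enough to
    # resolve the finest fringes; this toy grid does not.)
    for i in range(0, h, 32):
        for j in range(0, w, 32):
            z = depth[i, j]
            if z <= 0:          # Kinect reports 0 where depth is unknown
                continue
            # Rough, assumed mapping from pixel indices to metres.
            px = (j - w / 2) * 1e-3
            py = (i - h / 2) * 1e-3
            r = np.sqrt((hx - px) ** 2 + (hy - py) ** 2 + z ** 2)
            field += intensity[i, j] * np.exp(1j * k * r) / r

    # Normalise so object and reference beams have comparable strength,
    # then interfere with a flat reference beam to get the fringes.
    field /= np.abs(field).max()
    return np.abs(field + 1.0) ** 2

# A fake frame standing in for a 640x480 Kinect capture.
depth = np.full((480, 640), 0.5)        # everything half a metre away
intensity = np.random.rand(480, 640)    # grey levels standing in for colour
fringes = hologram_from_depth(depth, intensity)
```

Even with this aggressive sub-sampling the inner sum dominates the running time, which is why moving it onto graphics hardware is the natural way to push past 15 frames per second.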