This week saw the final demonstration of research developed over three and a half years of the FascinatE EU FP7 research project. The project has developed a complete end-to-end future broadcast system that combines ultra-high-definition panoramic video, 3D ambisonic and object-based audio, new methods for delivering interactive AV content, and new interfaces for interacting with the AV media at the user end. It has been my pleasure to lead the University of Salford's part of the project and, this week, to host its final demonstration.
We hosted the final demonstration event at our MediaCityUK building. It's one of the few places that could support what we were trying to do: the building's infrastructure was designed for exactly this kind of work, though we pushed it pretty hard this week. FascinatE partners worked through the nights, and our fantastic MediaCityUK tech team pulled out all the stops to make it happen.
In the Digital Performance Lab we had a performance of a Steve Davismoon composition, 'deeper than all roses'. The music was performed by the band bears? bears! (some of our very talented music students here at Salford), with dance performed by Joseph Lau and Shona Roberts. Joe also directed and choreographed the whole performance and designed the set (thank you Joe!).
The performance was captured as an ultra-high-definition 180-degree panorama, using six 2K camera feeds stitched together in real time to create a giant panoramic video feed. The cameras are part of the latest OmniCam from Fraunhofer HHI, one of the FascinatE project partners; the full rig actually has ten cameras and can capture 360-degree panoramas. This was combined with footage from a broadcast camera from the BBC (who are on the project too) and sent to a content analysis engine for video analysis.
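To give a feel for how a panorama relates back to its individual feeds, here is a toy sketch (not the OmniCam's actual pipeline, whose stitching handles lens geometry and blending) that maps a viewing angle in a 180-degree panorama back to one of six equal 30-degree camera slices:

```python
# Illustrative sketch only: which camera feed supplies a given panorama
# angle, assuming six cameras each covering an equal 30-degree slice of
# a 180-degree panorama. Real stitching also corrects lens distortion
# and blends overlapping regions.

CAMERAS = 6
PANO_FOV_DEG = 180.0
CAM_FOV_DEG = PANO_FOV_DEG / CAMERAS  # 30 degrees per camera
CAM_WIDTH_PX = 2048                   # width of one "2K" feed

def source_pixel(pano_angle_deg):
    """Return (camera_index, local_x) for a panorama angle in [0, 180)."""
    cam = min(int(pano_angle_deg // CAM_FOV_DEG), CAMERAS - 1)
    frac = (pano_angle_deg - cam * CAM_FOV_DEG) / CAM_FOV_DEG
    return cam, int(frac * (CAM_WIDTH_PX - 1))

print(source_pixel(45.0))  # → (1, 1023): 45° falls mid-way through camera 1
```

The same lookup, run for every output column, is essentially the geometric core that a real-time stitcher implements on GPUs.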
For audio we had a 32-capsule Eigenmike and a number of other close mics capturing sound in 3D, both as a higher-order ambisonic sound field and as many separate audio objects with coordinate locations.
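To illustrate how an audio object with a known direction becomes part of an ambisonic sound field, here is the standard first-order (B-format) encoding equation in code; the project used higher orders, which simply add more spherical-harmonic channels on top of these four:

```python
import math

# Textbook first-order ambisonic (B-format) encoding of a single mono
# sample arriving from a known direction. W is the omnidirectional
# channel; X, Y, Z are figure-of-eight channels along the three axes.

def encode_first_order(sample, azimuth_rad, elevation_rad):
    """Encode one mono sample into (W, X, Y, Z) B-format channels."""
    w = sample / math.sqrt(2)                                # omni, -3 dB
    x = sample * math.cos(azimuth_rad) * math.cos(elevation_rad)
    y = sample * math.sin(azimuth_rad) * math.cos(elevation_rad)
    z = sample * math.sin(elevation_rad)
    return w, x, y, z
```

A sound field held in this form can be rotated or decoded to any loudspeaker layout at playback time, which is what makes it a good fit for a format-agnostic broadcast chain.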
All of this AV data was streamed in real time to a range of displays: TV sets, large projectors, a Christie tiled wall in our building foyer, and iOS and Android mobile devices. People at the demo could freely pan and zoom around the panorama, controlling their own virtual camera with swipes on tablets or, on the larger displays, with intuitive hand gestures via Kinect.
If you zoomed into the scene you could hear the audio change appropriately: zoom into the guitarist and his amp became clearer and louder, while the singer moved off to the side or behind you in the mix to match her position on the screen.
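The behaviour described above can be sketched very simply for one audio object. This is a hypothetical illustration in the spirit of the demo, not the FascinatE renderer itself: the function name, panning law, and gain formula are all our own simplifications.

```python
import math

# Toy zoom-adaptive object audio: each object has a panorama angle; as
# the virtual camera zooms toward it, it gets louder, and objects off to
# the side are panned to match their on-screen position. Illustrative
# formulas only, not the project's actual renderer.

def render_object(obj_angle_deg, view_center_deg, view_width_deg):
    """Return (gain, pan) for one audio object.

    pan runs -1 (hard left) to +1 (hard right); objects outside the
    current view stay audible but attenuated, so the scene still feels
    like a space you can look around in.
    """
    offset = obj_angle_deg - view_center_deg
    pan = max(-1.0, min(1.0, offset / (view_width_deg / 2)))
    # Objects near the view centre dominate; a narrower view (deeper
    # zoom) boosts everything still on screen.
    closeness = max(0.0, 1.0 - abs(offset) / 90.0)
    zoom = 180.0 / view_width_deg
    gain = closeness * math.sqrt(zoom)
    return gain, pan
```

Zooming from the full 180-degree view down to a 90-degree window doubles the zoom factor, so an object at the view centre gains level while one at the edge is panned hard to the side, which matches the guitarist/singer behaviour described above.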
For a more ‘lean back’ experience, visitors could hand over control to the FascinatE virtual director, which used content analysis of the video to identify regions of interest, frame the shot, and make edit decisions for you.
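At its simplest, a virtual director of this kind takes scored regions of interest from the content analysis and decides when a re-framing is worth a cut. The sketch below is a toy version under our own assumptions (score-weighted target, fixed cut threshold), not the project's actual algorithm:

```python
# Toy "virtual director": given regions of interest from content
# analysis, here just (centre_x, score) pairs, choose a crop centre as
# the score-weighted mean, and only cut when the new framing differs
# enough from the current one to justify an edit. Threshold and
# structure are illustrative assumptions.

CUT_THRESHOLD_PX = 200  # don't cut for tiny re-framings

def choose_frame(rois, current_center):
    """rois: list of (center_x, score) pairs. Returns the new crop centre."""
    if not rois:
        return current_center
    total = sum(score for _, score in rois)
    target = sum(x * score for x, score in rois) / total
    # Hysteresis: stay on the current framing for small movements.
    if abs(target - current_center) < CUT_THRESHOLD_PX:
        return current_center
    return target
```

The threshold acts as simple hysteresis, which is what keeps an automatic director from jittering between near-identical framings on every frame.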
The project has been running for almost three and a half years and has been a real pleasure to work on. The project partners really are the best in Europe, and I have a terrific team here at Salford who have kept our work successful and on schedule. The opportunity to host the final demonstration at MediaCityUK was scary and exciting in equal measure: the most difficult but also the most exciting event I've been responsible for since we moved in, and certainly the one that has pushed the building's infrastructure hardest.
UPDATE: Some more coverage of the event…
Here are a few pictures from the day as well.