FascinatE Project Review
May 19, 2011
One of the research projects I'm working on is a large-scale EU-funded research project called FascinatE: 'Format-Agnostic SCript-based INterAcTive Experience'.
The project has been up and running since last February and is developing a complete future end-to-end broadcast system for covering live events. We are capturing 180-degree hi-res panoramic video and stitching it together with clusters of HD cameras so that the whole panorama can be navigated under script-based, and possibly free-viewpoint, control from the viewer. Though is 'viewer' still the right word? I think more in terms of an active participant once you start creating your own scene, and your own narrative through an event. We've been talking in terms of 'lean forward' and 'lean back' approaches, so there's a default set of production choices for your personal tastes/demographic/preferences in 'lean back' mode and options for wider control in 'lean forward' mode.
I just got back from the first annual review of the project by the European Commission, and it all went pretty well. The best bit of the review process for me was that all 11 partners put together demos of the work they've been doing over the last year or so for the reviewers, and there's some great work been done already. Here's a brief flavour of some of it, thanks to Andrew Gibb of BBC Research, who made notes along the way and was kind enough to pass them on for me to edit (thanks Andy).
Technicolor and the Technical University of Catalunya demonstrated real-time interactive exploration of a 7k by 2k video panorama, with pan, tilt and zoom controlled by gestures or keyboard. They've recently switched to using Kinect for their motion tracking and it works very nicely indeed.
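To give a rough idea of what pan/tilt/zoom navigation of a panorama involves (this is my own toy illustration, not the partners' actual renderer, and the function and parameter names are made up for the example), the simplest version is just picking a crop window out of the big stitched image:

```python
import numpy as np

def extract_view(panorama, pan, tilt, zoom):
    """Crop a viewport out of a wide panoramic frame.

    panorama: H x W x 3 image array covering the full stitched view
    pan:  horizontal centre of the view, 0..1 across the panorama
    tilt: vertical centre of the view, 0..1 down the panorama
    zoom: fraction of the panorama each axis of the view covers
          (smaller = zoomed further in)
    """
    h, w = panorama.shape[:2]
    view_w = max(1, int(w * zoom))
    view_h = max(1, int(h * zoom))
    # Clamp the crop so the window never leaves the panorama.
    x0 = int(np.clip(pan * w - view_w / 2, 0, w - view_w))
    y0 = int(np.clip(tilt * h - view_h / 2, 0, h - view_h))
    return panorama[y0:y0 + view_h, x0:x0 + view_w]

# A blank panorama at roughly the 7k-by-2k scale, viewed dead centre
# at quarter zoom.
pano = np.zeros((2000, 7000, 3), dtype=np.uint8)
view = extract_view(pano, pan=0.5, tilt=0.5, zoom=0.25)
```

A real renderer also has to correct the lens and cylindrical projection so straight lines stay straight as you pan, but the crop is the core of the interaction loop that gestures or a keyboard drive.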
Technicolor demonstrated 4th-order ambisonic presentation of audio from Chelsea vs Wolves, recorded with a 32-capsule Eigenmike. This sounded FANTASTIC; you would not believe the way the crowd reaction washes around the stadium like an acoustic Mexican wave. It also made it clear to me that reproducing height information for audio makes a massive difference to immersion.
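For anyone who hasn't met ambisonics: the demo used 4th order, but the flavour of it comes across even at first order, where a mono source is encoded into four channels (W, X, Y, Z) from its direction. This is just the standard first-order B-format panning equations as a sketch, nothing to do with Technicolor's actual processing chain:

```python
import numpy as np

def encode_first_order(signal, azimuth, elevation):
    """Encode a mono signal into first-order B-format (W, X, Y, Z).

    azimuth and elevation are in radians. W is the omnidirectional
    component; X, Y and Z are figure-of-eight components along each
    axis -- Z is where the height information lives.
    """
    w = signal / np.sqrt(2.0)                        # omni
    x = signal * np.cos(azimuth) * np.cos(elevation)  # front-back
    y = signal * np.sin(azimuth) * np.cos(elevation)  # left-right
    z = signal * np.sin(elevation)                    # up-down
    return np.stack([w, x, y, z])

# A source dead ahead at ear level: energy lands in W and X only.
s = np.ones(4)
b = encode_first_order(s, azimuth=0.0, elevation=0.0)
```

Higher orders add more spherical-harmonic channels (25 of them at 4th order), which is what sharpens the spatial image enough for that crowd-wave effect, and why a 32-capsule mic is needed to capture it.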
BBC showed work towards rendering together high and low resolution video from different cameras.
Alcatel-Lucent showed us delivery of panoramic video dynamically resized over a limited-bandwidth network.
We got a sneak preview of a split-head version of Arri's Alexa camera and…
Rob and I, from the University of Salford, demonstrated real-time audio object detection and extraction, detecting and tracking a football being kicked, along with whistle blows etc., for reproduction over ambisonic and wave field synthesis systems.
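As a flavour of the detection side (a deliberately crude toy of my own, not Salford's actual method), one way to flag something like a referee's whistle is to check whether most of a frame's energy sits in the narrow high-frequency band where whistles live; the band edges and threshold here are illustrative guesses:

```python
import numpy as np

def detect_whistle(frame, sample_rate, band=(2000.0, 4500.0), threshold=0.5):
    """Flag an audio frame as whistle-like if most of its spectral
    energy falls inside a narrow high-frequency band."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum.sum()
    if total == 0:
        return False  # silence: nothing to detect
    return spectrum[in_band].sum() / total > threshold

# A pure 3 kHz tone should trip the detector; a 200 Hz hum should not.
sr = 16000
t = np.arange(1024) / sr
whistle = detect_whistle(np.sin(2 * np.pi * 3000 * t), sr)
hum = detect_whistle(np.sin(2 * np.pi * 200 * t), sr)
```

Extracting and tracking the object in space on top of that, robustly and in real time against crowd noise, is where the actual research effort goes.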
Although the EC review is a fairly onerous and stressful event, it was hugely beneficial to see at first hand what the various partners have been doing on the project. It's hard to imagine putting together demos on that scale without the review as a driver, and the reviewers, as well as asking difficult questions, were genuinely interested in what we are doing.
One or two of you may have seen some of the presentations and papers we put together for the 130th AES Convention in London recently. Rob Oldfield presented Salford's work on audio object extraction, and with Technicolor colleagues we had a poster about the complete FascinatE audio system, from acquisition to reproduction. We also have papers accepted and demos running at IBC in Amsterdam in September, so if you'd like to see some of what we've been doing, please get along to our booth and presentations at IBC and get a taste of what broadcast might be like in a few years' time.