You've got to protect your writing time when you're starting out. Find out when you can write and guard that time fiercely – if you don't protect it, nobody else will. You can accomplish a lot in two hours.
Next month, London will host two key media industry conferences – the venerable London Book Fair and the second outing of Transmedia Next. Storytelling professionals happy to stay in the world of business-as-usual will be attending the London Book Fair. But those who have discovered that business-as-usual doesn’t cut it in the 21st century – who want to stay at the cutting edge of media production – those people will be hitting Transmedia Next.
Transmedia Next is a three-day series of seminars, workshops and exercises aimed at training storytelling professionals in the theory and practice of transmedia storytelling. It is hosted by Seize The Media, with the support of the EU MEDIA Programme. Lance Weiler, Seize The Media’s creative director and chief story architect, unnerved attendees of the Sundance Film Festival with his short film “Pandemic 41.410806, -75.654259”. The film played in conjunction with a transmedia experience accessible to people on the streets of Park City, and the Sundance crowd got a peek into Weiler’s compelling and intricate storyworld, “Pandemic 1.0” (www.hopeismissing.com).
Lance Weiler’s Pandemic 1.0 short film, shown at Sundance
I spoke with Anita Ondine, transmedia producer and CEO of Seize The Media, about transmedia and Transmedia Next. Anita is passionate about educating creatives and producers in the method and vocabulary of transmedia production. She grew up in Australia surrounded by artists and creatives. Her later years took her to law school and then to a series of positions tackling legal issues of technology and intellectual property for major firms. She was a Senior VP at Lehman Brothers in London until 2006, when she decided to pursue filmmaking full time. For her, the transition from finance to film was perfectly natural. She has always been a storyteller and a communicator, and her practical experience in the no-nonsense arena of The City gave her the perfect toolkit for becoming a 21st century producer.
The term “transmedia” is thrown around with ever-increasing frequency, but surprisingly few people, even those in the media industries, have a solid grasp of what exactly it is. “Transmedia” is often confused with the old-school term “multi-media”. Multi-media is the presentation of a story in multiple formats – often repeating the same story in a book version, then a film version, then a game version, etc. Ondine explains that transmedia is a type of storytelling in which the story exists independently of the media used to present it. The story exists before and beyond its appearance in a specific form, and each media experience is a limited window onto that larger story. “There are gaps in the storytelling,” Ondine says, “where the audience – or participants as I like to call them – fill in their own experience, through their own imaginations or by supplying content themselves or by actually physically taking part in the story.”
Anita Ondine, Transmedia Producer
Lance Weiler’s “Pandemic” short, which Ondine produced, is only one viewpoint into the Pandemic storyworld. A carefully architected web of online and real-world content allows participants to interact with the Pandemic 1.0 storyworld in a variety of ways. It is that careful structuring of the storyworld parameters – its characters, timeline, rules, narrative style – and the orchestrating of the venues by which participants can access it that makes transmedia such a challenging and exciting storytelling arena.
Developing a transmedia storyworld requires forethought and vision. The development and production of a computer game might be a comparable endeavour, but a highly complex transmedia story might have a computer game embedded in it as only one of the numerous experiences available to the participant. And how each of these different experiences interacts with the others and with the ever-evolving participant can be unpredictable. In a transmedia experience, the participants or audience might begin contributing more to the story, changing things in real time, introducing complications and story twists of their own. The story architects must be meticulous in their preparation of the underlying narrative and technological structures supporting the storyworld. Transmedia Next emphasises that the preproduction of a transmedia story is as important as the storytelling itself. Though some of the well-tested workflows of 20th century media production still apply, new ways of building a story and offering it to an audience have had to be introduced, often through an R&D process that continues beyond deployment of the story. The world of transmedia storytelling is still in its infancy, a “Wild West” where methods and techniques are still being pioneered and experimentation is the name of the game.
Transmedia Next is a gathering of professionals who already have a solid grounding in their own creative arenas – design, writing, finance, production – and this is one of the features that most excites Anita Ondine. The conversation that develops among these gathered professionals can be as enlightening as the seminars themselves. Transmedia Next participants are reminded that they are as vital a part of the learning process as Ondine and the rest of the seminar leaders. Characteristic of a transmedia experience, attendees move out of the realm of passive observer to active participant, discovering insights and methods that a single artist might never have arrived at on his or her own.
Ondine is eager to help people discover how transmedia stories can be both creatively financed and profitable. Because transmedia has such a wide reach in terms of the demographic of its participants, as well as a variety of venues in which it might be encountered, it has the potential for many different kinds of revenue streams. Typical of the digital age, revenue generated by transmedia projects tends to be non-linear, with multiple types of revenue potential, from the old media model of volume and unit selling to a whole salad of options including subscriptions, sponsorship, ad sales, and franchises. Ondine says, “Transmedia is about the experience. That’s what makes it unique. You’re not restricted to moving units. The income can come from selling experiences.” And certainly, there is no limit to what can be experienced. The transmedia income model calls for as much creative vision as the transmedia story architecture.
This year’s Transmedia Next will again feature Anita Ondine and Lance Weiler. Joining them again this year will be Inga von Staden, a Berlin-based media architect and educator for 21st century media creatives. She has published and lectured widely on technology-enhanced media and brings an intellectual rigor and years of experience to the seminars. New on the Transmedia Next team this year is Jonathan Marshall, who has been a lead technical strategist for the BBC’s interactive TV initiatives and is CTO of Social Television at SlipStream. His work for the BBC also won him a BAFTA.
Transmedia Next takes place 12th – 14th April, 2011 in London. For more information go to TransmediaNext.com or email sam [at] transmedianext.com.
The term “transmedia” seems to have originated in 1991 with Marsha Kinder, the critical studies dynamo of USC’s cinema school. When I was at USC, cinema students were divided into two theoretical camps. You were either a Marsha Kinder devotee (European, experimental, theoretical cinema) or a Drew Casper devotee (classic American cinema with big movie stars). And I have to say it was Drew Casper for me at the time. But I was young and narrow-minded. Today it would be Marsha though, definitely.
MIT gave “transmedia” its seal of approval in 2003, when Henry Jenkins – also a USC media professor now – wrote his game-changing article “Transmedia Storytelling”. I’ve always thought that knowing the names of things is one of the differences between an amateur and a professional. But the terminology is still in flux when it comes to 21st century media. We’re making this stuff up as we go, and it will take some time to get our toolboxes properly organised. In the present Tower Of Digi-Babel ruckus, some people say “transmedia”, some “crossmedia”, some “media 360”. Me, I say “full spectrum media”. Richard Wagner said “Gesamtkunstwerk”. And people in 1999 said “new media”.
The key feature that distinguishes true transmedia from stories presented discretely in traditional formats is that in transmedia, the story exists before and beyond its appearance in a specific form. I’d like to think that the pen & paper role-playing games of yesteryear were one of the first truly transmedial entertainments, where characters, places, monsters and events are assumed to already exist and the stories experienced and told by players are spin-offs and riffs on the already existing world. Some of the major science fiction franchises too have offered up stories, characters and worlds that appear transmedially, as children of the original universe. But we are still in the Wild West phase of transmedial storytelling and transmedia is yet to fully stand on its own two – or three or four or twelve – feet.
Last September, I attended one of the most important conferences of the year for European media professionals – Transmedia Next. The three-day event took place in London – in a lovely Thames-side corporate building, halfway between the Tate Britain and the Houses Of Parliament – and was hosted by transmedia pioneers Seize The Media. It featured lectures, discussions and exercises facilitated by a full spectrum of transmedia expertise – Seize The Media’s CEO Anita Ondine, the company’s Chief Technical Architect David Beard, and its award-winning Story Architect Lance Weiler.
Providing intellectual backbone to the Transmedia Next conference was media expert and educator Inga von Staden. She is director of the Interactive Media programme at Berlin’s Filmakademie Baden-Württemberg and also directs the MEDIA training program at the Media Business School, Spain. She works first-hand with visionary creatives who are in the process of inventing 21st century media. She has been a guiding force in moving them forward and has, just as often, learned from them what transmedia really can do.
I had the good fortune to interview Inga von Staden about her work and the past, present and future of 21st century media:
Inga von Staden
NEAL ROMANEK: So how do you answer when someone asks you “What IS transmedia?”
INGA VON STADEN: I tell them it is one of several terms used in the converging media landscape. “Transmedia” was coined by Jenkins (and Kinder—NR), focusing on a story taking the user through different media platforms. Another term currently in use is “crossmedia”, which was coined by the advertising industry, referring to integrated, cross-platform campaigns. And there’s also “360Degrees”, which refers to a theme playing out across a multitude of platforms and also includes factual content that may be less story-driven than fiction. 360Degrees is quite a popular term in the European media industries.
NEAL ROMANEK: You started out as a filmmaker. How did you get to where you are now?
INGA VON STADEN: I began working in television and film productions in 1987. In 1995 I migrated into games and internet development as a conceptor and project manager. Then I worked as a consultant for the media industries from 1999 to 2008 helping with the paradigm shift from analogue to digital. My clients were print publishers, tv broadcasters, and also the film industry.
The more I worked with these companies, the more I became aware that there were too few professionals who could do the work that converging media implied. So in a lecture I gave at the Bertelsmann Association in 2000, I proposed we change our narrowly focused film education to a wide media education to create professionals who develop and produce content for all media platforms. My proposal was not particularly well-received at the time. But ten years later the director of my film school wrote in the school’s studies guide: “360Degrees is the new magic term.” I do not really think it is magic, but I do think it is a very sensible approach to the media business.
In 2002 I set up Germany’s first European MEDIA programme, “The Academy of Converging Media”, training authors and designers to think transmedially. And I wrote the first national studies on digital cinema for the German Film Fund in 2002 and 2003. Today I run a four-year diploma studies programme at the Filmakademie Baden-Württemberg (www.filmakademie.de) dedicated to 360Degree Media. Our students are trained to work as transmedia producers and transmedia content directors.
Now, apart from giving lectures at conferences, hosting seminars and directing the studies program mentioned above, I coach interdisciplinary teams through the transmedial development process. It is very stressful to be suddenly doing what you have always done your way with others who do it their way. We see this even when we bring together students from different departments or schools in our Content Labs. But the innovative force that is unleashed once the communication and methodology have been practiced is simply amazing.
NEAL ROMANEK: And you became involved in Transmedia Next how?
INGA VON STADEN: For the Transmedia Next conference, I was contacted by Anita Ondine, who had heard of me through European MEDIA training programs on converging media.
NEAL ROMANEK: So is there any going back to the traditional, 20th century way of doing things for you? Or do you see yourself now permanently operating in a multi-platform world?
INGA VON STADEN: Once I began seeing ideas, stories and themes through the transmedial lens, I found it very hard not to make transmedial suggestions when involved in a development process.
NEAL ROMANEK: What are the most difficult parts of the creative process in constructing transmedia material? What are the unique challenges that are not present in producing other content?
INGA VON STADEN: To go transmedial means you have to allow for a pre-development. In other words, before you develop a format – a film, a game, etc. – you must first develop the story universe that will be the foundation on which all the different media formats will be based. This concept of pre-development is not usually taken into consideration in the development process of content. Or in the budgets for the development process. Furthermore, you have to change the process to be less linear. Transmedia is much too complex to be designed by just one person. It is a team-oriented pool process. The producer needs to bring in other disciplines to participate through the entire development process – a technical director, for example, and a community manager. And the content director needs to be educated in all media formats to understand the input they’re getting from the different team members.
Transmedial projects tend to become very big and complex. The art is to allow all the possibilities that transmedial thinking can offer to come into the brainstorming sessions at the same time, and to structure the development process along a commonly accepted methodology, e.g. the “content onion” by Raimo Lang (YLE). You have to consolidate the content into an operative idea – the creative kernel – and from there build the red thread through the story universe the team has designed. The art is to keep it simple.
NEAL ROMANEK: How is today’s transmedia different from past efforts at presenting a story across multiple platforms? How does it differ, say, from what Walt Disney was doing with simpler technologies 50 years ago?
INGA VON STADEN: Transmedia is more than just crossmedial distribution. Transmedia is understanding a story or theme to be more than just a film or game or an app. It is about the notion of the story universe that an author has in his head as he writes a story. By exposing that story universe, the team members and co-production partners can share in it and become part of the creative process. They can collaborate to design a media architecture that will take that idea or story or theme across different platforms. This is not about a film going onto the internet or a character being merchandised, though that could certainly be part of the design. It is about understanding what part of the idea, story or theme plays best where, and how the different media formats are interlinked via “rabbit holes”, portholes into the various spaces within the story universe. A simple example is having the main plot of a story happen in a TV series and different sub-plots staged on the Internet. Another might be opening up spaces in the story universe to users, who will create user-generated media to feed into the overall content.
NEAL ROMANEK: Whenever I talk to producers about transmedia, there often seems to be the same response – and said with quite a lot of confidence: “There isn’t any good way to make money from it”.
INGA VON STADEN: Transmedia is more expensive than “simply” making a film, game, app, or building a community. But each format in a transmedial architecture will be cheaper than if it had been produced on its own. You create synergies, production resources that can be reused and reconfigured. So rather than making 5 films for the sum of X, you are creating 5 different formats for the sum of X minus Synergy.
Furthermore, we are seeing film producers having an increasingly hard time coming up with pre-financing. So, thinking transmedially, you can create content that feeds into another platform and then cross-finance your film with those revenues.
There is no one business model. There are many business models out there. Each media platform has its models for making money. They would otherwise not exist. Now, you will probably not be making money by uploading your film to YouTube – a film does much better economically in a cinema, on DVD or on a VoD platform. But you may make money selling elements of your story universe on a pay-per-item basis and collecting micropayments from an online community. A transmedial producer must creatively combine the financing and revenue models out there to come up with a project’s very own business model. I call it the transmedial business mash-up model.
NEAL ROMANEK: What is the biggest potential growth area for transmedia? Entertainment? Marketing? Education? And where is the best transmedia work being done today?
INGA VON STADEN: Transmedia will most likely be making its big money with entertainment as most media does. But we are currently seeing the most interesting projects come up in the factual realm born out of the necessity of documentary film companies to find new business models in order to survive. Examples are “Prison Valley” or “The Galata Bridge”. And we have seen transmedia happen for years already in children’s media. Kids find it so easy to surf a story environment on different platforms.
NEAL ROMANEK: I feel like transmedia now is in the same place as movies were in the 20th century. Movies imitated past popular media, like novels and theatre. A lot of transmedia seems to imitate movies. How do we get away from imitating movies?
INGA VON STADEN: You soon become more creative than simply emulating the movies if you bring in different disciplines into your creative team. A game designer has a very different approach to content, as does a designer of apps or a builder of communities. Take a look at the great work Dr. Randy Pausch (creator of the Entertainment Technology Center at Carnegie Mellon University) did with his “Building Virtual Worlds” class.
NEAL ROMANEK: How can transmedia creators help each other?
INGA VON STADEN: Share experience and build a body of professional knowledge! That is the only way we can all begin making interesting projects and earn a living with them.
(originally printed in Videography magazine, Feb. 2010)
London’s Victoria and Albert Museum, generally acknowledged as the world’s largest institution of decorative arts and design, was founded by Queen Victoria in 1852. You might think that digital video technology was the last thing the grand monarch had in mind, but the V&A was originally conceived as a “Museum Of Manufactures”. From its very beginning, at the height of the Industrial Revolution, it sought to be a place for the study of the meeting point of art and technology.
“Decode: Digital Design Sensations” is a new exhibition the V&A is putting on in partnership with contemporary arts & design organization onedotzero. “Decode” runs from Dec. 8, 2009 to April 11, 2010, and strives in every way to break the boundaries of the typical museum experience. The Victoria & Albert Museum has been collecting digital art since the 1960s. An exhibition called “Digital Pioneers” showcases this material concurrently with the “Decode” exhibition.
Today’s interconnected world of social and digital information, becoming familiar to even the most luddite household, has broken down traditional structures of space and time in media. People are increasingly used to the idea of watching a movie when, where and how they want to, for example, and the 20th century notion that it is an experience restricted to a brick-and-mortar theater with predetermined show times is fast becoming a novelty of the entertainment past.
Shane Walter co-founded onedotzero in the mid-nineties and has been on the cutting edge of digital media production ever since. He is named as the curator of the “Decode” exhibition, but thinks of himself as more of a producer and collaborator. A couple of the pieces in the show were commissioned directly by onedotzero, with Walter taking an active interest in their production.
“Decode”, arranged more along the lines of an extended festival than a gallery show, takes for granted the 21st century decentralization of media and then tries to see into the far reaches of digital media with fluidity and change being the focal element. Not only does an audience have an array of choices regarding its relationship to the digital pieces presented in the show, but the pieces themselves are morphing and changing as a result of their relationship to the audience.
The exhibition is divided according to three themes – “Code”, “Interactivity”, and “Network” – each representing an element of the digital creative process. “Code” presents pieces that use computer code to create new works and looks at how code can be programmed to create constantly fluid and ever-changing works. It emphasizes the idea that the code itself is a kind of creative entity that makes content – and that is willing to collaborate with the audience if instructed to do so. It’s as if the original programmer had created another little artist existing in a virtual realm – The Code.
“Code nowadays is a raw material,” says Shane Walter. “A coder is as creative as a sculptor, for example, or a painter or designer.”
A museum space is traditionally, almost by definition, a place where something is preserved in stasis. The Decode exhibition strives to turn that on its head. “We’re saying,” says Walter, “Yes, touch things as much as you want, take pictures, and you can download elements of the exhibition, you can upload things too. We want to make you, the visitor, a part of the exhibit.” Some of the exhibits in fact can’t exist without an audience there, and are designed to spring into motion in response to the movement of the visitors, as if the exhibit itself were an artist, interpreting its subject – the museum-goer. Golan Levin’s “Opto-Isolator II”, for example, takes the form of a human eye that stares at the viewer, and blinks and moves in direct response to the viewer’s blinking and movement. “A lot of these ideas and techniques have been around for some time, but I think only in the last five years have people been using them less as a novelty and more as part of their digital toolbox.”
The “Network” portions of the exhibition focus on and utilize the digital traces left behind by everyday communications. Walter points out how the digital web that we live in is something living, connected to us in a very intimate way. “It’s almost impossible to switch off. Even when you’re asleep, your Facebook is still going, your Twitter feed. You’re leaving all these digital traces behind. So artists are trying to make sense of this area by data mining this realm, and using that data to re-represent the world.” Aaron Koblin’s “Flight Patterns” uses FAA data to create animations of flight paths that are both ghostly and dreamlike, as well as fascinating and informative. Repeatedly in Decode, hard fact and hallucinatory visions combine to produce a revelatory experience.
Decode also offers people the chance to participate actively via Karsten Schmidt’s piece, “Recode”. Created as a digital identity for the exhibition, the code for the ever-changing CG piece is available on an official Decode Google Code page, which also has a detailed user guide and GUI. New creations can then be uploaded to the Decode exhibit’s online gallery, contributions which are considered no less legitimate than the original Karsten Schmidt work.
A fascinating challenge Decode presents to the museum world is that the digital media in the exhibit is entirely fluid and in fact quite difficult to practically catalog in a museum’s inventory. If a piece is constantly changing and altering, how accurate is its description? And if viewers are an integral part of the experience, can they be said to be part of the museum’s archive?
Decode presents a fascinating array of digital wonders and experiences, the kinds of things which might have been seen by attendees of SIGGRAPH but have not been offered to a wide audience. Technologies that might have languished as mere curiosities have now, in the hands of artists, been made to communicate – and in some cases, commune – with a new generation of media audiences.
An axon is the part of a nerve cell that sends one cell’s signal on to the next nerve cell. Without an axon, a nerve cell might be overflowing with great ideas, but its fellows will never know about them. AXON Digital Design, headquartered in The Netherlands, has specialized for years in producing hardware for the transmission, processing, and routing of audio and video signals. Now they’re becoming a key player in getting 3D signals where they need to go.
When Belgian HD company Outside Broadcast wanted to demo the live 3D transmission of an athletics event at last year’s IBC, it didn’t go to a sleek new company selling “the latest in 3D broadcast solutions”. It went to its time-tested partner AXON. Outside Broadcast were already using AXON’s Synapse infrastructure. Their challenge for AXON was not “Can we get some new technology which will allow us to do something new?” but “How can we do something new with the technology we already have?” The Memorial Van Damme, an annual summer event being held in Brussels at the King Baudouin Stadium, concurrently with Amsterdam’s IBC, provided a perfect opportunity to give professionals a taste of the future of live 3D.
TVBEurope asked AXON’s Chief Technology Officer, Peter Schut, how the company met Outside Broadcast’s needs. “Our Synapse modules are a hot-swappable solution. The infrastructure’s already there. For Outside Broadcast, we modified existing hardware to cope with their requirements, so they didn’t have to rewire anything. It’s very convenient if we can add these features to existing hardware.”
The principal requirement for the 3D transmission was to handle two simultaneous signals of left and right eye information provided by the stereoscopic camera rig. Outside Broadcast used a mirror-splitter stereo camera rig, so the solution also had to flip the mirrored image. AXON modified its already existing HXH150 card.
For the 3D transmission, the left eye and right eye sources needed to be combined into a single HD-SDI video stream. This was accomplished by squeezing both images down to an anamorphic half-horizontal size, then displaying them in a side-by-side mode compatible with MPEG transmissions.
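The side-by-side packing described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not AXON's actual hardware logic: each eye's frame is squeezed to half its horizontal resolution (by naive column-pair averaging here, where broadcast hardware would use proper filtering), and the two half-width images then share one full-width frame.

```python
import numpy as np

def pack_side_by_side(left, right):
    """Pack two same-sized eye frames into one side-by-side frame.

    left, right: (height, width, channels) arrays of equal shape.
    Each eye is anamorphically squeezed to half horizontal
    resolution, then placed in its half of the output frame.
    """
    h, w, c = left.shape
    # Naive horizontal decimation: average adjacent column pairs.
    squeeze = lambda img: img.reshape(h, w // 2, 2, c).mean(axis=2)
    frame = np.empty((h, w, c), dtype=left.dtype)
    frame[:, : w // 2] = squeeze(left)
    frame[:, w // 2 :] = squeeze(right)
    return frame

# Dummy 1080p frames: black left eye, white right eye.
left = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.full((1080, 1920, 3), 255, dtype=np.uint8)
sbs = pack_side_by_side(left, right)  # one full-width packed frame
```

For a mirror rig like Outside Broadcast's, one eye would additionally be flipped horizontally (e.g. with `np.fliplr`) before packing, which is the "flipping of the mirrored image" the modified card handles.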
“It was combining software blocks that we would previously use for other purposes,” Peter Schut says. “The card was designed to provide a 4:3 image with a pillar-box on each side to contain additional, external information. It could already receive data from two sources. It was then just a matter of ‘moving the curtains around’ and the scaling algorithms.” His description makes it sound like something accomplished during a working lunch, but it is the AXON team’s skill and experience that allow for elegant, simple solutions – solutions that might take less established companies many hours, and many Euros, to resolve. AXON has prided itself on providing customers with smooth transitions when there are leaps in technology.
Outside Broadcast’s 3D demo was so impressive, that AXON was asked on the spot by EuroMedia for support in its camera motorcycles for 3D broadcast of the Tour De France and the Tour De Paris.
AXON are now offering a new dedicated 3D card, the H3D100. The H3D100 will also be available in a 3Gb/s version, and both versions will have a variety of extra features. The inputs are the standard left and right eye images for stereo production, and the flexible outputs are directly usable in a production switcher. Look for the H3D100 modules to feature prominently in AXON’s NAB 2010 presentations. The cards are currently being trialed on a few different projects, including one with L.A.-based video rental company Sweetwater. Sweetwater is also using AXON’s 3D cards to output stereo video for anaglyph viewing. Though anaglyph (you know, those different-coloured lenses from the 1950s) is not recommended for transmission or viewing by a large audience, it is an inexpensive means of monitoring various 3D outputs.
Despite working intensively in providing 3D solutions to major players in European broadcasting, Schut – like many of us – has been reluctant to whole-heartedly join the 3D mania ubiquitous in the media industries. We asked him the eternal question: Will people watch 3D at home? Schut answered, “Until December, I would have said no. And then I saw “Avatar” – 3 hours of 3D without getting a headache. Now, I think Christmas 2010, if someone’s going to be buying a new big-screen TV, they’ll want it to be 3D capable. How much the viewers will be watching 3D, I don’t know. I don’t think the whole family, day in, day out, is going to be watching every programme with glasses on.”
Schut notes that, in the wake of “Avatar”, CES was a virtual stampede of 3D products and services – some perhaps creating more problems than they would ever solve. AXON’s reconfiguring of its well-tried hardware may suggest that solutions to your new 3D challenges may not be far from home.
The BBC’s South Pacific – 1000 Islands, 1000 Challenges
(as printed in the July 2009 edition of TVBEurope)
Tanna Volcano in Vanuatu, South Pacific
“South Pacific” (2009) is the latest attempt by the BBC documentary team to break all previous records. The six-part series is the first to present a thorough study of a part of the world that is still relatively unexplored.
TVBEurope spoke with veteran BBC producer Huw Cordey about the production of the series. Cordey has been involved in many of the BBC’s landmark wildlife shows. He was a segment producer on “Planet Earth” (2006) and “The Life of Mammals” (2002), but “South Pacific” is his first time as producer of an entire series.
The remit of the series, narrated by Benedict Cumberbatch, was potentially as wide-reaching as the location itself. Even a series as broad in scope as “Planet Earth” deals with definable locations, but before beginning production on this series, the very definition of “South Pacific” was open to debate. “The most obvious definition,” Cordey says, “is everything in the Pacific that’s south of the equator. But that was too narrow for the series because we wanted to look at it in terms of the cultural and natural history aspects, and these things don’t always respect invisible lines like the boundary of the equator. Covering Hawaii, which is well above the equator, was entirely appropriate. It was colonized by Polynesians from the south and the animal and plant species that arrived there were carried by the same natural forces that populated the rest of the islands.”
With so many locations to choose from, literally tens of thousands of islands, over such a vast area – all the seven continents could be put inside the Pacific Ocean with room to spare – production planning had to be meticulous. Islands large enough and flat enough to feature a grass airstrip were reached by plane, but a lot of crew and equipment travel was exclusively by boat.
The series also faced the challenge that very little scientific work has been done on most of the islands. Scientists have simply not yet got around to fully exploring the Solomon Islands or the islands of Vanuatu.
Huw Cordey is extraordinarily well travelled. His father worked for Shell and Cordey had been all over the world long before he began his career with the BBC documentary team, yet even he was stunned by how remote the locations were. “A great portion of our locations I had never even heard of.”
The first episode of the series features the idyllic island community of Anuta. Half a square mile in area, with no harbour, and surrounded by reefs and fast current, Anuta Island is accessible only by a carefully piloted boat. From a remote port in the Solomon Islands, it took the production’s yacht – owned and piloted by the team’s cameraman – five days to get to Anuta. Cordey and his crew were dropped off, and with no place at Anuta to anchor, the yacht had to sail away to a nearby port – 75 miles away. The “South Pacific” crew were the first visitors to Anuta in two years. The previous visitors were the crew of the BBC programme “Tribe”. Cordey and his crew had an experience usually thought of as belonging to another century: “The Anutans had gotten wind of our coming only a day before our arrival because they have a VHF radio on the island which they use to communicate with a few Anutans living on the Solomon Islands.”
Filming an Oceanic Whitetip Shark, Hawaii
Surprising television audiences is never easy today, especially when the BBC has set such consistently high bars for itself. “When you’re talking about landmark television, you generally have to do two things. You either have to improve upon sequences that have been done before – either with technology or by showing better animal behaviour. Or you surprise people with something completely new. In some ways it was easy to surprise the audience because so much of it was unfamiliar.”
Varicams were used throughout the series – including the underwater sequences – with four kits shared between the crews of the six episodes. A relatively small number of cameramen were employed to cover the entire series. Having a smaller pool of photographers was more cost effective, and it simplified production: the wheel did not have to be reinvented every time a new cameraman joined the production.
“South Pacific” features all the technical sophistication of its BBC antecedents – and more. The Cineflex camera stabilization mount was used on helicopter shots, the same technology famously employed on “Planet Earth”. Cineflex mounts were rigged onto the helicopters of the obliging Chilean navy, for flights over Easter Island, a Chilean protectorate. The series features the first ever HD aerial shots of Easter Island and its foreboding statues.
Of course, the series features technical innovations used for the first time on nature TV shoots. Even before the series premiere, much buzz had been generated by online video of surfing legend Dylan Longbottom riding waves in super slow motion off the island of Pohnpei. The shots also feature the first slow motion footage showing the details and vortices of massive waves as they form and break. The images are both hypnotising and breathtaking.
Cameraman Bali Strickland with TyphoonHD4
This super slow motion footage was captured by the TyphoonHD4 camera. Like the Photron camera used on “Planet Earth” to shoot South Africa’s breaching great white sharks, the TyphoonHD4 records continuously to a hard drive cache, at extreme shutter speeds. The TyphoonHD4 is able to retain full HD resolution up to 1000fps. Its light sensitivity – essential for underwater shooting – was what made it Cordey’s choice.
Dr. Rudolf Diesel, the German mind behind the system, was asked to design a waterproof housing for the camera’s first foray under the sea. The unit would be making its debut in big surf, with a reef two meters below, and would need to be manageable by a single, swimming operator. Diesel, both an engineer and an expert in marine biology, was literally adjusting the camera housing until minutes before shooting.
Bali Strickland, a 29 year old Australian who has shot some of the world’s greatest surfers and greatest surfing footage, operated the TyphoonHD4. In partnership with Dylan Longbottom, he wrestled twelve foot waves and the massive camera housing, and managed to capture some of the series’ signature shots.
Even Strickland, however, was apprehensive about the job. “There was very little room for manoeuvre,” Cordey explains. “Bali Strickland had to use all his skill to keep himself safe, and the camera too. He said to us right from the start, ‘Look, if I get in danger, I’m sorry, but I’m letting the camera go.’” Confirming just how treacherous the shoot was, Longbottom, whom some call the best surfer in the world, was pushed into the reef by a wave moments after getting one of the shots. He managed to get to the surface, almost knocked out, with blood pouring from one ear.
Rudy Diesel has developed a second generation housing for the TyphoonHD which weighs in at 11kg – as opposed to the series’ 20kg prototype. The “South Pacific” camera was also only able to capture two 2.5-second shots at 500 fps before it had to be returned to the boat, opened, and the footage downloaded. The latest iteration of the system allows almost as much recording time as batteries will allow.
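The figures quoted above imply a formidable sustained data rate, which is why the prototype had to be opened and downloaded after every shot. A back-of-envelope sketch (the resolution and bit depth here are assumptions for illustration, not published TyphoonHD4 specifications):

```python
# Rough data-rate arithmetic for a full-HD high-speed camera.
# Resolution and bit depth are assumed, not manufacturer figures.
width, height = 1920, 1080       # full HD frame
fps, shot_seconds = 500, 2.5     # figures quoted for the "South Pacific" shoot
bytes_per_pixel = 1.25           # assumed 10-bit raw sensor data

frames = int(fps * shot_seconds)
frame_bytes = width * height * bytes_per_pixel
rate_gbit_s = frame_bytes * fps * 8 / 1e9   # sustained write rate
shot_gbytes = frame_bytes * frames / 1e9    # cache filled per shot

print(f"{frames} frames per shot")
print(f"sustained write rate ≈ {rate_gbit_s:.1f} Gbit/s")
print(f"cache per {shot_seconds} s shot ≈ {shot_gbytes:.1f} GB")
```

Under these assumptions a single 2.5-second burst fills several gigabytes of cache at roughly ten gigabits per second, well beyond what 2009-era removable media could absorb continuously.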
One of the great messages of the series is that life is determined and will always find a way to flourish. Cordey points out, “Every single one of those thousands of islands has been colonized by something. In the South Pacific there is no such thing as a deserted island.”
(printed in April 2009 TVBEurope as Solving 3D Headaches: Matt Brennesholtz Helps Negotiate A Challenging 3D Future)
“I love watching 3D, it’s just that after 10 minutes I have a pounding headache.”
At tradeshows, exhibitions, screenings, even meet-ups of 3D devotees, one hears it over and over. At the Digital Television Group’s Summit 2009 in March, an overview of Sky’s plans for 3DTV was introduced with “Here’s Chris Johns to tell us about eye strain.”
There has been a mad rush to produce 3D content even though there may not be the viewership for it. Critics vocally wonder if the producers of 3D content are living in a fool’s paradise, preparing for The Next Big Thing that may never come. The Beijing Olympics was touted as the “3D Olympics”. 3D trials were to play in limited markets, primarily in Asia. The fact that few people have heard that Beijing was the “3D Olympics” may suggest how successful the experiment was.
Creating dynamic, believable and commercially viable 3D images is a challenge that has been around longer than most people suppose. 3D is usually associated with the 1950s and the spate of anaglyph-based 3D feature films – although the anaglyph technique had been used to create 3D images since the 1850s. The first stereoscopic motion picture patent was taken out in the 1890s and the first 3D camera rig was patented in 1900.
TVBEurope talked with 3D expert Matt Brennesholtz, a senior analyst at Insight Media who has worked in partnership with the 3D@Home Consortium. The 3D@Home Consortium was formed in 2008 to speed the commercialization of 3D into homes worldwide. It also attempts to facilitate the development of standards, roadmaps and education for the 3D industry. In 2007 Brennesholtz co-authored a 400-page report “3D Technology and Markets: A Study of All Aspects of Electronic 3D Systems, Applications and Markets”. This all encompassing document forecast the viability of 3D display technology in a vast array of markets into the next decade. Its scope included not just stereoscopic 3D displays, but a variety of autostereoscopic displays, and rotating image plane, vibrating membrane, and micropolarizer technologies.
Brennesholtz is an expert in display technologies, having been a lead projection system architect at Philips LCoS Microdisplay Systems. He has a Master of Engineering in Optics and Plasma Physics from Cornell University and has been granted 23 patents. Still, we asked the question on everyone’s mind – why do we get a headache when we watch 3D?
“One of the fundamental problems with 3D displays,” he explains, “is the problem of convergence and accommodation.” Convergence is the ability of the eyes to stay trained on a single point in space; it is what allows you to focus on the text on a mobile phone three inches from your nose. Accommodation is the ability of the eye itself to change focus with distance, like a mini-camera.
Stereoscopic images rely on the brain’s default setting of always making a single image out of the pair of images received by the eyes – as opposed to how chameleons do it. The perceived “space” between the two side-by-side images in a 3D show is compensated for by convergence with the eyes going from being parallel towards being crossed and back – just as they would in watching a live event.
The element that is challenging for the brain – and for some viewers – is that the image in a 3D display is always exactly the same distance away, on the surface of the screen. The convergence of the eyes sends the message that objects are moving forward and backward in space, but the real image each eye is capturing stays put. The brain is trying to tackle two different ways of seeing at once, like a computer running two memory-intensive applications at the same time. The fact that the eyes are making very few focus changes doesn’t mean that the brain is not revving like an engine every time it thinks something is moving toward it or away from it. Perhaps, like the trick of being able to pat your head and rub your stomach at the same time, the brain may get the hang of it with repeated viewing.
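The geometry of the conflict can be sketched with a few lines of similar-triangle arithmetic: convergence tracks the perceived depth of a point, while accommodation stays locked on the screen. The interocular separation and viewing distance below are illustrative assumptions.

```python
import math

def screen_disparity(eye_sep_m, screen_dist_m, perceived_dist_m):
    """On-screen horizontal disparity (metres) that places a point at
    perceived_dist_m from the viewer, by similar triangles.
    Positive = behind the screen (uncrossed), negative = in front (crossed)."""
    return eye_sep_m * (perceived_dist_m - screen_dist_m) / perceived_dist_m

def vergence_deg(eye_sep_m, dist_m):
    """Angle between the two eyes' lines of sight for a point at dist_m."""
    return math.degrees(2 * math.atan(eye_sep_m / (2 * dist_m)))

b, D = 0.065, 3.0           # ~65 mm between the eyes, 3 m viewing distance
for Z in (1.5, 3.0, 12.0):  # perceived depth: in front of / at / behind screen
    p = screen_disparity(b, D, Z)
    print(f"Z={Z:4.1f} m  disparity={p * 1000:+6.1f} mm  "
          f"vergence={vergence_deg(b, Z):.2f} deg  accommodation fixed at {D} m")
```

The vergence angle swings with every apparent depth change while the focus distance never moves off the screen plane, which is exactly the mismatch Brennesholtz describes.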
“There can be other human factor problems associated with bad 3D displays,” Brennesholtz notes, “with certain types of encoding, for example, but this is a fundamental problem that really is inescapable in the 3D display world.”
The most serious aggravation of the accommodation-convergence discrepancy is when the content creator puts images in the virtual space in front of the screen – the monster reaches out to camera, the enemy fires a hundred arrows at us, and the like. These are the effects that producers may push because they have greater visceral impact, but they are also the things that most bother the eyes. Brennesholtz says the solution is to place most 3D effects at the level of the screen or behind it.
Another significant issue, one to induce headaches in content creators rather than viewers, is that the content has to be created for the screen size and viewing distance of the intended audience. Analogous to needing different sound mixes for DVD, theatrical, and mobile device content, each 3D version of a programme must be mastered with its final destination in mind. Sound mixers have learned to manage complex sets of presets for each intended format; it seems likely that 3D mastering will have to do the same.
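A toy check makes the point: a disparity that is harmless as a fraction of a phone's picture width becomes an absolute distance wider than the gap between the viewer's eyes on a cinema screen, forcing the eyes to diverge. The screen widths and the one-percent figure below are invented for illustration, not mastering guidelines.

```python
# Why one stereo master cannot serve every screen: the same fractional
# disparity scales into a different physical distance on each display.
EYE_SEP_MM = 65  # background disparity beyond this forces the eyes outward

disparity_frac = 0.01  # 1% of picture width, as (hypothetically) mastered
screens_mm = {"phone": 90, "TV": 1100, "cinema": 12000}  # picture widths

for name, width in screens_mm.items():
    d = disparity_frac * width
    verdict = "OK" if d <= EYE_SEP_MM else "eyes diverge -> remaster"
    print(f"{name:7s} {d:7.1f} mm  {verdict}")
```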
Although some roadblocks to the perfect 3D experience are exactly the same as they were in the 1950s, Brennesholtz points out that the sophistication of today’s technology may overcome the others. “Some of the other problems that have been associated with 3D, like dimness or differences in brightness and color between the two images, can be overcome with proper display, screen and video signal design.”
Brennesholtz underlines that consumer demand for a quality experience is the principal factor in the adoption of 3D. “The end user, whether he’s watching broadcast television or cable or Blu-ray or is sitting in the cinema, is not going to give up anything to get 3D. He’s not going to give up resolution. He’s not going to give up frame rate. He’s not going to accept flicker. He’s not going to accept headaches. Basically, he wants his 2D experience – which right now when you look at HDTV is really good – but with 3D.”
Questions about 3D are in no short supply. Approximately 10% of the population are unable to see 3D properly – what kind of strategy must be developed when such a large segment of the audience must automatically be discounted? Most people are unaware that many TVs are already “3D ready” – but where is the extra bandwidth going to come from if 3D TV is going to become a reality? And finally, if eyeball convergence and focus are such core issues in 3D viewing, what happens to the 3D experience after the third beer?
Gearhouse Broadcast’s new HD OB truck, called HD-1, could very well be the biggest OB truck in Britain. There is no doubt, after it finishes its transoceanic voyage next year and arrives at its destination in Australia, it will be the biggest OB truck in Australia, and probably the Southern Hemisphere.
HD-1 will be used in Australia for Channel 7’s coverage of Australian Rules Football. Kevin Moorhouse, Technical Director of Gearhouse Broadcast, says that on the matches the vehicle will be operating at about 70% capacity. But he anticipates that with the vehicle’s 28 camera capability, it will soon become a one-truck solution for 95% of Channel 7’s onsite productions.
TVBEurope toured the vehicle as it was being systems integrated for its new Australian venture at the company’s European headquarters in Watford, UK. Gearhouse Broadcast’s trucks are coach-built by A Smith Great Bentley Ltd. HD-1’s project manager is John Fisher, who has been in the industry for over 40 years. HD-1 is the sixteenth truck John has built and he will start integrating number seventeen on behalf of Gearhouse Broadcast in the New Year.
Making their truck builds long-lasting and future-proofed is vital for the success of Gearhouse Broadcast’s integration business model. All in, to build and integrate HD-1 was a multi-million pound exercise. The chassis alone takes between 26 and 28 weeks to construct. All the cable in the truck runs down a single underfloor channel in the center, rather than in the expands, so that – stationary and supported by the chassis – there is negligible wear on the cable over time. Kevin Moorhouse says of Smith’s construction, “They build trucks like battleships. It costs around three quarters of a million pounds just to build the chassis, but we expect to get ten years out of that chassis.” In fact, the group’s first truck, Unit 1, built almost 20 years ago, has just been refurbished and is still operating.
Gearhouse Broadcast made the decision to have no video jackfields on any inputs or outputs of the router in their OB vans. Given the router’s size it would be impossible to overpatch the router if it failed on a production. Also, with HD signals, the extra connections of a jackfield introduce losses into the signal path. Gearhouse trucks carry backup cross-point cards and I/O cards in case of router failure. It is this simple stripping away of everything that is not essential, while retaining and augmenting the most vital features, that has resulted in steady improvement in each iteration of Gearhouse’s OB trucks. The puzzle of cramming three dozen workers into a confined space loaded with sophisticated technology – technology which, literally, cannot afford to fail – is solved with simplicity and elegance.
HD-1 seats up to 38 people. The triple expand configuration allows for unprecedented floor area. Closed for transport, the unit is 2.5 metres wide and fully within regulations for travel on Australian roads. Deployed, the 16.5 metre long truck is an impressive 7.5 metres wide – with 40 kilometres of cable inside.
The HD-1’s Pro-Bel 576 X 576 Video Router was first employed at the Beijing Olympics. The company’s ability to swap components in and out from their own inventory allows for fine tuning of their budgets – and rates for their customers. When Gearhouse has already earned money on equipment from previous shows, they can then offer such “used” technology – in this case, three months old – at more flexible pricing, if need be.
The Production section at the center of HD-1 features a unique 3-level step area. An engineering necessity was, in this case, turned into an opportunity for design innovation. The fifth wheel of the Australian rig is higher than the British standard and so required more area beneath the floor to accommodate it. The resulting steps up, allowing space for the fifth wheel, also create a tiered production area with unrestricted line-of-sight for each one of its 16 positions.
The new Sony LMD monitors Gearhouse used at the Beijing Olympics proved themselves superior in quality and resolution. Accordingly, the production area was fitted out with twenty-one 24” Sony LMD 2450’s and eight 17″ Sony LMD 1750’s.
The Production area also features a fully specified Sony MVS 8400 4ME Vision Mixer, with 80 Inputs, 48 Outputs, and built-in DME.
The Vision & Engineering area, in addition to the Pro-Bel 576 X 576 Router, features 5 Sony HD Grade 1 Monitors, 24 HD/SDI External Remote Source inputs, 5 HD Down Converters, 10 Cross Converters, 10 Synchronisers, 4 SDI Aspect Ratio Converters and 3 HD Hex Splits.
The VTR section of the truck sports 12 six-channel EVS HD XT2’s with 4 Digital VTRs, as well as a Pro-Bel 576 X 576 HD/SDI and 256 X 256 AES Routers.
HD-1 has space for three audio engineers at a Calrec Sigma Audio Desk with Bluefin technology. The Calrec Sigma has 320 channel-processing paths, allowing up to 52 × 5.1 surround channels on one Bluefin signal processing card. The truck’s audio complement includes 128 AES and 128 analogue inputs, 128 AES and 112 analogue outputs, a Pro-Bel 256 X 256 AES Audio Router, and a Riedel 144 X 144 Talkback System. Also included are four Dolby E Encoders and six Dolby E Decoders.
While Gearhouse Broadcast is setting new benchmarks for OB systems integration in Australia, the company will also be flying in a new and better set of tools to Sub-Saharan Africa. South Africa-based satellite broadcaster Supersport has commissioned a flyaway kit from Gearhouse for domestic football matches. It will rival anything available in Sub-Saharan Africa and top most kits available in the rest of the continent. André Venter is Head of Operations for Sub-Saharan Africa, a new position created at Supersport. He vetted several companies looking for an immediate – literally immediate – solution for 8-camera Supersport football broadcasting in Africa. The production infrastructure might vary widely from country to country, and for Supersport to provide consistent, first-rate service, it would need a robust kit that could be moved and deployed quickly and easily – and they wanted it immediately. Gearhouse Broadcast was the only company who, when tasked with Supersport’s request for “immediately”, responded with “no problem”. It was able to supply a loan flyaway within a week, and then set about building the three permanent flyaways. André Venter explained, “We wanted to show the world that Africa is capable of producing high quality productions that are on a par with any broadcasters across the globe.”
The fly away kit will feature an 8-camera system made up of Sony BVP E30’s, a Sony 2.5 M/E DVS vision mixer, Teletest rack mount monitors, Harris Inscriber G1 graphics, 2 x 6 Channel SD EVS XT2, Pro-Bel router, Harris glue, RTS/Telex comms system, Yamaha DM2000K digital audio mixer, and Sachtler tripods. A wide variety of Canon lenses will go with the kit too.
Word is out across African broadcasting, and Supersport is ready to ask Gearhouse Broadcast for more. First-rate, reliable technology appearing at the right time and place has stimulated a demand for more of the same.
With the world-wide credit crisis on everybody’s mind, it is gratifying to see demand for Gearhouse’s services continuing to expand. Will the OB systems integration slice of the industry remain recession-proof? Managing Director Eamonn Dowdall says they have yet to feel the pinch and adds “When people cut down on the luxuries, their subscription to the premium football channels is one of the last to go.”
(NOTE: Since the publication of this article, Mogulus has changed its name to Livestream)
Internet video company Mogulus, headquartered in New York, has taken the next logical step past the on-demand model and brought no-cost 24/7 live streaming to producers. Live worldwide broadcast has at last become available to anyone with a web connection.
Ironically, live images were one of the very first “broadcast” features carried by the internet in the late 1990’s. The main handicap was that the frame rate might be four frames per minute – and require you to refresh your browser every time you wanted to see the next frame.
Mogulus and its competitors – principally Ustream.tv and Justin.tv – represent the next iteration of internet video, and Mogulus is keen on providing a service in which on-demand, linear and live streaming video are all in the same producer toolbox.
Max Haot, Mogulus co-founder and CEO, moved to London from his native Belgium in 1995. He is best known for founding the ICF Media Platform while at sports media giant, IMG Media. The ICF Media Platform was purchased by Verizon Business in 2005. In 2007 Haot founded Mogulus with Phil Worthington, a graduate of the Royal College of Art and Mogulus’s Chief Product Officer, IMG Media colleague Dayananda Nanjundappa, and Mark Kornfilt, the company’s Chief Architect. Dayananda Nanjundappa is Mogulus’s CTO and oversees the company’s second office in Bangalore.
Max Haot explains, “Most internet video platforms center around on-demand, based on the assumption that people want to watch what they want when they want. But with Mogulus our vision is to give our producers everything that a TV station can do.” “Everything” is the ability to create a live 24/7 channel that runs all the time. Currently, most Mogulus channels are loop-based, with a playlist cycle of programming repeating itself, but Mogulus content can also be schedule-based.
The Mogulus online studio features a text ticker and overlay of simple graphics or logos. A Mogulus channel can go live at any time, allowing multiple live cameras or a mix of prerecorded and live video content. Within the same player, producers have the option of offering an on-demand library of their clips as well.
“We provide the full turn-key service,” says Haot, “So a producer does not need to understand what a content delivery network is. All he needs to do is use our browser-based Studio. And then he can take our player widget and embed it on any webpage or social network.”
Last year, Axis Films co-sponsored a set of HD camera tests, in which a battery of digital cameras, from the Viper FilmStream to prosumer HD camcorders, were meticulously run through their paces and tested against each other and against workhorse film cameras. The last weekend in January, Axis hosted another illuminating symposium, this time on 3D production and exhibition technology.
Axis was joined by The 3D Firm, Can Communicate, Inition and Quantel. Over 200 guests attended the two-day event, which offered a demo and ongoing workshop on the latest in 3D capture, post-production, and exhibition technologies, with particular emphasis on broadcast application.
It is difficult to open any publication about the media industry without reading or hearing someone extolling, or damning, the economic and aesthetic attributes of 3D exhibition. The showcase at Shepperton gave industry technicians and producers an opportunity to look beyond the hype and get some real facts.
First of all, if you want to sound like you know your stuff – and who does not in our industry? – avoid saying “3D”. Say instead, “stereoscopic”. The terms are interchangeable, but “stereoscopic” video is the accurate description of the medium. As with stereophonic audio, the effect relies on only two sources of information – left eye and right eye. The stereoscopic effect encourages the brain into believing it is observing objects existing in a 3D space, in the same way that a two-speaker audio system encourages the brain to believe it is hearing sounds from a multitude of sources – when in fact there are only two. The “3D” name is better suited to marketing and advertising; “stereoscopic” to the real production process.
Stereoscopic post-production has been revolutionized in the digital age. Quantel’s Pablo was on show at the Axis demo, dazzling attendees with its deftness in handling 3D editing and post. The Quantel system, used extensively in many phases of post-production, requires very little reconfiguring to manipulate stereoscopic data. Given the saturation of visual effects and compositing content throughout the industry, most post-production workstations are ready to handle stereoscopic moving images. In fact, most of 3D production and post-production is fairly unremarkable. To say it is the same as conventional production, but with one more camera, would not be too far afield.
A downside to 3D film production in past decades has been the simple mechanical challenge of getting film elements to register cleanly. Not only did the negative in the camera have to register properly in order to produce the elements for a clean 3D image, but then diverse film projectors in diverse theatres had to project the two film elements in sync, with precise calibration of the overlap of the two images. Digital technology now allows perfect synchronization and overlap of stereoscopic elements, which can be exhibited perpetually with zero degradation in quality. This advance in production, together with the greater standardization of exhibition parameters necessitated by digital technologies, has further opened up the opportunities for stereoscopic broadcast.
Many projection facilities have equipment that can accommodate 3D content, though the number of stereoscopic theatrical releases remains relatively small. If 3D is to become a widely distributed feature of broadcast, wide-screen 3D releases will not provide any great percentage of the content. So where would a regular supply of 3D content come from?
There is a surprisingly large amount of 3D content hidden in plain sight. Shown at the Axis demonstration were 3D colour newsreels of the young Queen Elizabeth II – an example of the unique treasures hidden away in archives, some of which have remained virtually unviewed for decades. Just as Turner leveraged its MGM archive into one of the great cable movie channels, TCM, there are vast 3D libraries ready to be digitized for broadcast. Digital post technologies allow easy, on-the-fly cleanup of these film originals. The Quantel at the Axis presentation showed off the ease with which dust and scratches on the negative were erased from the digital elements of the Queen Elizabeth footage.
Today’s effects-rich media, in which even the most humble productions feature some 3D graphics or compositing work – in title sequences, at least – is another untapped gold-mine of stereoscopic content. The great open secret of 3D programming is that every frame that comes out of a 3D graphics program is ready for immediate adaptation to stereoscopic motion pictures. It already exists as a 3D image within the computer and with the term “rendering time” becoming an anachronism, outputting the POV of a second virtual camera can be done, almost literally, at no extra cost.
Also, the conversion of 2D productions to 3D is a rapidly developing specialty. At its most basic, the process uses simple, familiar compositing technologies. From the 2D footage, a background plate and other elements of characters or foreground are created. Multiple layers of these can be manipulated along the z-axis like cardboard cut-outs in a diorama. At the other end of the spectrum are more sophisticated technologies which calculate entire, detailed 3D spaces out of existing 2D footage – these are beyond the scope of this article, for the time being.
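The cardboard cut-out approach can be sketched in a few lines: each extracted layer is shifted horizontally in opposite directions for the two eye views, in proportion to its assigned depth. The layer names and parallax scale below are invented for illustration, not drawn from any real conversion pipeline.

```python
# Toy version of layered 2D-to-3D conversion: per-layer horizontal shifts
# for left/right eye views, proportional to assigned depth.

def stereo_offsets(layers, max_parallax_px=12):
    """Return per-layer pixel shifts for the left/right eye views.
    depth=0.0 sits at the screen plane; 1.0 is the deepest background."""
    views = {}
    for name, depth in layers.items():
        shift = depth * max_parallax_px / 2  # half the parallax per eye
        views[name] = {"left": +shift, "right": -shift}
    return views

# Hypothetical layers pulled from a 2D plate
layers = {"actor": 0.0, "street": 0.4, "sky": 1.0}
for name, v in stereo_offsets(layers).items():
    print(f"{name:7s} left {v['left']:+5.1f} px  right {v['right']:+5.1f} px")
```

The foreground layer gets no shift and sits at the screen plane, while deeper layers are pushed apart between the two views, which is all the diorama effect amounts to.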
As with most broadcast technologies which showcase visual spectacle – HD programming springs to mind – new 3D content tends to be confined to sporting events, stage performances, and nature programs. The 3D family melodrama has yet to be made. These spectacle types of entertainments are designed to directly engage a viewer on a visceral level, and the stereoscopic experience – like the HD, 5.1 surround experience – has the potential to augment that. Another, more subtle element is that these types of content emphasize the documentation of a real event – generally one in which the audience maintains a static point of view. 3D presentations can often mimic the experience of watching something from a single point of view, the illusion sometimes being interrupted when the camera begins to move. If stereoscopic production and post are not handled skillfully, a moving camera can irritate the viewer rather than enhance the 3D effect.
The elephant in the room regarding the new 3D push is: “Is it really a new revolution? Or is it the same old thing one more time?” The truth is, at least one journalist – though fascinated and inspired by the technology – left the Axis 3D presentation with stinging eyes and a headache.
Stereoscopic photography was developed in the 1840s, on the heels of the photographic technique itself, and its basic principle has remained virtually unchanged. Much press has stated that we are poised on the edge of a paradigm shift in which 3D presentation will be ubiquitous or, some would even argue, the norm. But stereoscopic theatrical exhibition was vigorously promoted in the 1950s and, despite continuing improvements in the technology, did not take hold as many hoped it would. Is this the old saw of repeatedly performing the same actions, but expecting them to produce different results?
Despite IMAX and other big-screen 3D venues, the new outlet for 3D content might well be HD broadcast. 3D LCD monitors, including the Planar StereoMirror professional display, were exhibited at the Axis demo, but for consumers to trade in their HD monitors – which themselves required months of nervous window shopping and saving – for 3D monitors will require a saturation level of 3D content which, at this juncture, would seem decades away. Time-tested technologies using conventional monitors, viewed with special glasses, will be the standard for 3D exhibition for the foreseeable future. The 3D Holy Grail of “no special glasses” – beyond a few specialty venues – will not be adopted by home viewers.
The Beijing Olympics may well be the trial by fire for 3D broadcast. The games will feature a channel dedicated to stereoscopic coverage of events. East Asia has remained at the forefront of 3D broadcast content, and it will be vital for European producers to study the behaviour of East Asian audiences and the strategies of their broadcasters. The Beijing “3D Olympics” will also be a laboratory for a dedicated 3D production workflow and 3D troubleshooting and problem solving in multiple settings.
One shocking fact presented at the Axis Films workshop might be enough to rock the foundation of every 3D business plan in the works. Roughly 8% of the population cannot see stereoscopic video. This is due to a range of factors, including partial blindness, amblyopia (“lazy eye”), and focus difficulties. Whether or not a broadcast revolution can be built on a technology which immediately excludes 8% of its audience remains to be seen.
Neal Romanek is a screenwriter and journalist living in London. He attended USC’s Cinema-TV Production program and writes for a diverse collection of entertainment media publications in Europe and the USA. His official website is: http://www.nealromanek.com
When the cross-Channel ferry Herald of Free Enterprise capsized in March 1987, a Belgian photographer working with the BBC was handed the responsibility of organizing all aerial photography of the disaster. For search and rescue, forensic analysis, and breaking news for an anxious populace, it was essential to collect as much aerial footage as possible, as quickly as possible. Young Wim Robberechts called every aerial camera crew in the United Kingdom and so entered a niche he would occupy for the next 20 years, and eventually come to dominate.
Wim Robberechts is owner of Wim Robberechts & Co., one of the top aerial photography equipment and services vendors in Europe, perhaps in the world. We spent a day with Robberechts and his company, based in a two-story building in the Diegem area on the outskirts of Brussels.
Aerial cinematography is like the shooting of complicated visual effects – substantial sums are spent for a few seconds of footage, a crack team operates sophisticated equipment under the microscopic gaze of panicked producers, and in the end the director takes all the credit.
In a recent example from Michael Palin’s “New Europe” (2007), the ex-Python is seen through the window of a DC-3 and then slowly drifts away in a massive pull back that dwarfs both him and the plane. The shot was captured by one of Robberechts’s young operators, Evert Cloetens Vandenbranden, using the Cineflex, which has become the gold standard for gyro-stabilized aerial camera mounts. It was the only shot Wim Robberechts & Co. did for the Palin series, yet it is likely to be one of the show’s most memorable moments.
Robberechts is keenly aware of the delicate position his – often anonymous – crew occupies. “Our job is always to serve the client. And we are always asking ourselves how to serve the client better.” Robberechts describes, without mentioning names, working with arrogant or difficult personalities, where the equipment and expertise of his company are not always put to best use. When asked how he responds to such clients, he answers by putting a finger to his lips. “If they do ask us ‘What do you think?’, we will tell them. Otherwise, we keep our mouths shut.”
Wim Robberechts & Co. has long employed Wescam helicopter mounts, for years an industry workhorse, but the foundation of the company is its three Cineflexes. The Cineflex is a gyro-stabilized HD camera unit that allows for rock-steady camera support on unstable or fast-moving platforms. It has been used extensively in feature films and news gathering and is a mainstay of sporting events. Viewers gasped at the recent spectacular HD aerial shots of Groupe SFP’s Tour de France coverage – captured by a Cineflex from Wim Robberechts & Co. The upcoming Olympic games in China will feature Cineflex mounts from several countries. The BBC’s “Planet Earth” (2006) was the first nature documentary to employ the Cineflex, stunning us not only with superb HD images, but with intimate aerial views of wildlife which would not have been possible with previous systems.
Operated via joystick, the Cineflex consists of an HD camera system housed in a 14.5-inch-diameter ball turret in the nose of a helicopter. It comprises five rotating axes, three of them gyro-stabilized, allowing the use of extremely long lenses which would be impossible to keep stable in a standard mount. Compared to bulky 35mm film camera systems, the Cineflex is fairly lightweight at about 85 lbs. The convenience of shooting to HD allows an aerial crew to stay in the air, and keep shooting, for much longer. Few are the producers who would go back to using film on an aerial shoot after capturing to HD.
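The gyro-stabilization described above is, at heart, a feedback loop: a gyroscope measures unwanted rotation of the mount, and motors drive the stabilized axes to cancel it. A toy single-axis sketch of that idea follows; the gain value and disturbance figures are purely illustrative, and a real mount like the Cineflex is of course far more sophisticated:

```python
# Toy model of one gyro-stabilized axis: platform motion perturbs the
# camera's pointing angle, and a proportional correction opposes the
# measured error each step.

def stabilize(disturbances, kp=0.8):
    """Return the residual pointing error (degrees) after each step."""
    error = 0.0
    history = []
    for d in disturbances:
        error += d           # helicopter vibration/rotation jolts the mount
        error -= kp * error  # motor correction cancels most of the error
        history.append(round(error, 4))
    return history

# Five repeated 1-degree jolts: without correction the camera would drift
# 5 degrees off target; with it, the residual error stays small.
print(stabilize([1.0] * 5))
```

The long-lens point in the paragraph above follows directly: at extreme focal lengths, even a fraction of a degree of residual error sweeps the frame across the subject, which is why only a stabilized mount makes such lenses usable from a helicopter.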
One thing Wim Robberechts has learned through his years in the aerial photography business is practicality, and perhaps there is also a kind of native Flemish prudence at work. He has been able to capitalize on challenges and thrive while seeing many of his contemporaries and competitors fall by the wayside. He has no plans whatsoever to own and operate his own helicopters. “It would be sexy to have our own helicopter as well, but then we become competitors with our friends.” Relationships with pilots and helicopter operators have been honed over long years of working together. Robberechts recognizes that expanding into every single niche of the aerial photography business would end up erasing those existing networks and do the company more harm than good. He says, very simply, that the companies he has seen drop like flies around him almost always “have decided to spend more than they could bring in.” This most basic tenet of business is understood by most business owners, but actually practiced by very few. Robberechts is one of those few.
Though Robberechts is himself a broadcast industry veteran, he deliberately employs a youthful team of technicians, some straight out of Belgium’s top film school, to help keep his edge sharp. “Some of these directors, the ones with the half-shaved face and expensive sunglasses, they are not going to speak the same language as me.”
Robberechts is invited to give regular lectures at the Brussels Film School and when there, he keeps an eye out for new talent. His years of experience have dictated a clear, hard-line set of criteria for potential applicants. “You must be able to speak at least three languages, and be willing to work for little money for two years. And say goodbye to any girlfriend or family life.” The training is intensive and all done in-house. “For the first two years, it costs more money to train a new operator than he brings into the company.” The commitment level must therefore be very high and Robberechts accepts nothing less than 100% commitment.
The beginning of a technician’s training might involve little more than riding in the chase van during the filming of a bicycle race, and might culminate with a first aerial shoot of power lines commissioned by the local government. Young company technician Evert Cloetens, an employee still in the middle of a long and steep education, earned his first solo shoot at Torino, shooting the downhill skiing. The Cineflex was mounted on the CAMCAT remote-control cable camera system. While another company’s technician handled the CAMCAT, Evert, seated at the controls beside the CAMCAT tech, captured the HD footage with the Cineflex. Evert is also an enthusiastic skydiver and skydiving camera operator, but, at present, Robberechts has no plans to add skydive photography to his rate sheet.
Operator Bas Vandenbranden came directly out of film school to join Wim Robberechts & Co. In addition to having the “right stuff”, he had a passion for remote-control model aircraft. Some of Bas’s early years were spent rigging timer-set Polaroid cameras to small balloons – then chasing the Polaroid photographs as they floated, leaf-like, back to earth. He has also just returned from shooting San Diego’s Red Bull Air Race with the Cineflex.
Robberechts briefly expanded with the addition of a Paris office, but he quickly abandoned the foray. The current situation in Brussels was hard to improve upon. Brussels is, if you include English, a tri-lingual city. Its designation as the economic hub of Europe puts it at the financial and political center of things, and its geography allows rapid, easy access to Britain or anywhere in continental Europe via air or Belgium’s straight, wide roads.
Technologies like the Cineflex demand a new way of employing aerial photography. It is no longer sufficient to show viewers high resolution images shot from a great height without also giving attention to aesthetics. With the dollar weak, Europeans are buying up the American-made Cineflex at a tremendous rate. As the technology becomes ubiquitous, the art of the aerial shot – its beauty, its dramatic context, its resonance – will come to the fore. Robberechts’s crew are likely to exchange the designation “operator” for “artist”.