

Making VR:
The Good, the Bad, & the Ugly

Written by: Kenzie Audette (Post/VR Production Assistant)
FRONTLINE
Twitter: @frontlinepbs

With the release of FRONTLINE's latest 360 project, "Night of the Storm," Kenzie Audette reveals the post-production process behind it.

In the spirit of helping other journalists and filmmakers dive into this emerging medium, I want to break down the good, the bad, and the ugly of virtual reality – as well as some exciting new technology coming down the pipeline.

For the last year at FRONTLINE, the investigative documentary series on PBS, we have been creating 360 VR documentaries. We’ve transported viewers to the heart of the fight against Ebola, taken them on a critical mission to deliver food in South Sudan and offered a rare on-the-ground look at what’s left of Chernobyl. This past Sunday, we released our latest 360 film – Night of the Storm – the harrowing story of a family hit hard by Superstorm Sandy.

As Raney Aronson, our executive producer, says, "long-form, linear documentary journalism will always be at the center of FRONTLINE and in our DNA, but we are also challenging ourselves to innovate." And we certainly have been challenging ourselves with virtual reality. Here's some of what we've learned.

Before you set off to make a 360 video, these are the basic questions you have to ask yourself:

1) What’s your story? And why VR?

2) Where do you plan to publish?

3) Are you doing 360 video or VR? Don’t know the difference? I’ll explain in a second.

4) 2-D (monoscopic) versus 3-D (stereoscopic)? Have I really lost you now?

You must always start with a good story.

At FRONTLINE, you'll commonly hear us talking about three things: What story do we want to tell? How do we tell it? And how do we put it out in the world? It's no different with VR. As our coordinating producer, Carla Borrás, who helps run our VR efforts, often says, "Right now VR is shiny and new, but as the novelty wears off, story will be king." So it's important to think about why your story will be better told in an immersive, 360 way.

Next up, let’s talk publishing, as this can inform how you make your video.

The good: publishing your VR content has become relatively easy. Both YouTube and Facebook support 360 videos, as do some newer players -- such as VRideo, which aims to offer a user-friendly, VR-focused platform. Aside from some simple tweaks, it can be about as easy as uploading a regular video. The bad: there are some limitations. For example, Facebook will only take a file that is shorter than 10 minutes, while YouTube requires you to inject 360 metadata into your file, which makes the process a little more involved.
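If you want to script that YouTube metadata step, Google publishes an open-source Spatial Media Metadata Injector (the spatial-media tools on GitHub) that writes the spherical-video tags YouTube looks for. Here's a minimal sketch that shells out to it from Python; it assumes the injector's spatialmedia module is importable where the script runs, and the file names are hypothetical:

```python
# Minimal sketch: inject spherical (360) metadata before uploading to YouTube,
# using Google's open-source Spatial Media Metadata Injector.
# Assumption: the "spatialmedia" module from github.com/google/spatial-media
# is available on PYTHONPATH or in the working directory; file names are hypothetical.
import subprocess
import sys

def inject_360_metadata(src: str, dst: str) -> None:
    """Write the spherical-video tags YouTube expects into a tagged copy of the file."""
    subprocess.run(
        [sys.executable, "-m", "spatialmedia", "-i", src, dst],
        check=True,  # raise CalledProcessError if the injector reports a failure
    )

if __name__ == "__main__":
    inject_360_metadata("stitched_equirect.mp4", "stitched_equirect_youtube.mp4")
```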

Then there is walk-around VR, which gets a bit trickier to produce. Some of the high-end headsets like the HTC Vive or Oculus Rift allow you to move around within a VR experience. (We're actually working on our first walk-around VR experience with Emblematic Group, but I'll save that for another blog.)

Now that you've got a platform in mind, you'll have to decide whether you want to create a 2-D (monoscopic) or 3-D (stereoscopic) experience. Stereoscopic means each eye receives a slightly different image, creating a 3-D effect; it's worth noting that this takes double the cameras to achieve. Monoscopic means each eye receives the same image; it's still quite effective at creating a sense of immersion, but not technically 3-D.
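To make the difference concrete on the post side: stereoscopic 360 deliverables are often packed as a single over/under (top-bottom) equirectangular frame, one half per eye, while a monoscopic deliverable is just one equirectangular image shown to both eyes. Here's a minimal sketch of splitting such a frame, assuming a top-bottom layout and that OpenCV is installed (the file names are hypothetical):

```python
# Minimal sketch: split a top-bottom stereoscopic equirectangular frame into
# left-eye and right-eye images. A monoscopic frame would simply be one
# equirectangular image delivered to both eyes.
# Assumes OpenCV (cv2); the file names are hypothetical examples.
import cv2

frame = cv2.imread("stereo_frame_topbottom.png")   # H x W x 3
assert frame is not None, "could not read the example frame"

h = frame.shape[0]
left_eye = frame[: h // 2]      # top half of the frame -> left eye
right_eye = frame[h // 2 :]     # bottom half of the frame -> right eye

cv2.imwrite("left_eye.png", left_eye)
cv2.imwrite("right_eye.png", right_eye)
```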

Stereoscopic sounds like more fun, right? But remember, stereoscopic means you have double the cameras, which means double the chances for a camera to be out of alignment, to run out of battery, or for something else to go wrong.

For our first VR endeavor, Ebola Outbreak: A Virtual Journey, we chose to shoot stereoscopic and used a 12-camera GoPro rig. We felt that while there was some value to the 3-D effect, it didn't justify the stitching and post-production hassle.

Which brings me to stitching, aka, the ugly.

Stitching is the hardest and most time-consuming part of the process. The more cameras used, the more stitch lines have to be joined. It can be a game of whack-a-mole to get everything stitched together flawlessly: you can get one seam lined up only to pull the other side completely out of alignment. There is also a certain distance your subjects should be from the camera -- too close and people pop in and out of the scene; too far and the resolution of the image becomes a problem, making it hard to read a subject's facial expressions. It's best to have your subject at least five feet from the camera and directly facing one of the cameras, so they don't end up on a stitch line.
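To put rough numbers on that distance trade-off, here's a back-of-the-envelope calculation of how many pixels of a 4K equirectangular frame a subject's face occupies at different distances. The frame width and face size below are assumptions for illustration, not measurements from our rigs:

```python
# Rough illustration of the distance trade-off in a 4K equirectangular frame.
# All numbers here are assumptions chosen for illustration only.
import math

FRAME_WIDTH_PX = 3840                     # a "4K" equirectangular frame spans 360 degrees
PX_PER_DEGREE = FRAME_WIDTH_PX / 360.0    # ~10.7 px of horizontal resolution per degree

def face_width_px(distance_ft: float, face_width_ft: float = 0.5) -> float:
    """Approximate horizontal pixels a ~6-inch-wide face occupies at a given distance."""
    angle_deg = math.degrees(2 * math.atan((face_width_ft / 2) / distance_ft))
    return angle_deg * PX_PER_DEGREE

for d in (5, 10, 20):
    print(f"{d:>2} ft: ~{face_width_px(d):.0f} px across")

# At ~5 ft a face spans on the order of 60 px; by 20 ft it is closer to 15 px,
# which is why expressions get hard to read as subjects move farther away.
```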


Given the complexities involved in stitching even six cameras together, you can imagine why you might want to avoid doubling that. In general, mistakes are also more difficult to fix in stereo. This is one reason why we've been returning to monoscopic for our current VR pieces.

Now I’ll get to the really good stuff! There are some very exciting new cameras coming to market that will make it much easier for everyone (even a novice) to create 360 films.

This April I attended the National Association of Broadcasters (NAB) Show, a technology summit where tens of thousands of industry professionals gather to discover the new technologies entering the market. Two cameras stood out to me.


On the consumer front, there is a $900 Kodak setup. It has two cameras mounted back-to-back, each with a 235-degree field of view, which gives a lot of overlap between the two images and helps in the stitching process. This seems like an ideal fit for most people trying to create quality content on a budget. The limitations of this camera are its relatively low frame rate of 30 fps and fisheye lenses that aren't great for close-up shots. You are also shooting in a compressed format. Unfortunately, most current cameras only shoot at 30 fps. For high-motion shots, a higher frame rate helps reduce motion sickness and increases immersion.

The other camera that impressed me is geared more to the professional market. The $4,000 Orah 4i is a four-camera option with less extreme fisheye lenses, which helps with up-close footage (less warping). It also has four microphones for capturing positional audio. Unfortunately, this camera doesn't ship until October. Still, it's an interesting solution to the problem of viewing your content in the field, as it can stream directly to a Gear VR headset.

What do you get for this extra cash? Well, the live-streaming option to a VR headset is a boon for filmmakers who want to make sure their shot is going to work before wasting time in post-production. It is also a single-cable solution that feeds into a processor box (used for stitching). It takes power directly from a socket, so no batteries are needed, and it can shoot for much longer durations than the Kodak; ours lasts about 40 minutes before overheating.


Also, the warping of fisheye lenses means they work best with subjects farther from the camera. And because the Orah has double the cameras of the Kodak, each camera covers a smaller field of view but produces a cleaner image.

I mentioned positional audio above, and my takeaway from NAB on this front is that while it is essential to a quality experience, it is not there yet in a practical form. There was an offering from Sennheiser arriving in the fall: a four-mic setup that records 360 audio. The bad news is that Facebook doesn't support codecs that allow for positional audio.

(Here is a great article to help you wrap your head around the audio possibilities)

Another discovery came not at any particular booth, but in my discussions with other VR enthusiasts. One way to get positional audio into your video is to use a game engine to connect sounds to “objects” in the video. Post-processing effects like reverb can help simulate the effect of 3-D audio. This seems like an easier path to making VR audio a reality: it's very similar to the method used for 3-D audio in video games, so the technology is already here and can be repurposed.
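As a toy illustration of the "attach a sound to an object" idea (a deliberate simplification, not how a game engine or ambisonic renderer actually works): given the viewer's head yaw and the direction of a sound source in the scene, you can derive left/right gains so the sound appears to come from that direction.

```python
# Toy illustration of attaching a sound to a direction in a 360 scene:
# compute left/right channel gains from the angle between the viewer's head yaw
# and the sound "object". This simple constant-power pan is a stand-in for real
# spatial audio (ambisonics / game-engine spatializers), for illustration only.
# Convention assumed here: positive azimuth is to the viewer's right.
import math

def pan_gains(head_yaw_deg: float, object_azimuth_deg: float):
    """Return (left_gain, right_gain) for a source at object_azimuth_deg."""
    # Angle of the source relative to where the viewer is currently looking.
    rel = math.radians(object_azimuth_deg - head_yaw_deg)
    pan = math.sin(rel)                  # -1 = hard left ... +1 = hard right
    angle = (pan + 1) * math.pi / 4      # map to 0 .. pi/2 for constant-power panning
    return math.cos(angle), math.sin(angle)

# A source 90 degrees to the viewer's right favours the right channel...
print(pan_gains(head_yaw_deg=0, object_azimuth_deg=90))   # ~(0.0, 1.0)
# ...and turning the head toward it re-centres the sound.
print(pan_gains(head_yaw_deg=90, object_azimuth_deg=90))  # ~(0.707, 0.707)
```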

A lot to take in, I know, but we haven't even started on the post-production! There are two big players in this space -- Kolor Autopano and VideoStitch. Both are good. Kolor was bought by GoPro and will likely start optimizing its software for GoPro's upcoming VR rig, making for a more seamless experience, while VideoStitch has its own camera, the Orah 4i.


Both seemed to have a bit of trouble with the occasional shot from GoPro camera rigs. To help fine-tune your stitch, you can bring in another program, PTGui. This is photosphere software that allows for a more precise stitch of a single frame; that stitch metadata can then be brought into the video-stitching software.

Once you feel comfortable with your stitching, you can begin editing! There is currently no way to actually edit in VR, although I could see that being truly revolutionary for the medium. The footage is treated like any other 4K video: on a flat screen it has quite a warped look, and it can take some back and forth, exporting to the headset, to get a feel for how it really plays.
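One way to cut down on that back and forth is to render a quick flat preview of what a headset would show for a given view direction. Here's a rough sketch of an equirectangular-to-perspective preview using NumPy and OpenCV; the projection math and defaults are my own assumptions for illustration, not part of any particular stitching package:

```python
# Rough sketch: render a flat, undistorted "headset view" from an
# equirectangular 360 frame, so you can sanity-check a shot without exporting.
# Assumes NumPy and OpenCV; parameters and file names are illustrative.
import numpy as np
import cv2

def equirect_to_perspective(equi, fov_deg=90, yaw_deg=0, pitch_deg=0,
                            out_w=960, out_h=540):
    """Project an equirectangular frame into a perspective view."""
    H, W = equi.shape[:2]
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)   # focal length in pixels

    # Pixel grid of the output view, centred on the optical axis.
    x = np.arange(out_w) - out_w / 2
    y = np.arange(out_h) - out_h / 2
    xv, yv = np.meshgrid(x, y)
    zv = np.full_like(xv, f, dtype=np.float64)

    # Unit ray directions, rotated by pitch (around x) then yaw (around y).
    dirs = np.stack([xv, yv, zv], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    pitch, yaw = np.radians(pitch_deg), np.radians(yaw_deg)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(pitch), -np.sin(pitch)],
                   [0, np.sin(pitch),  np.cos(pitch)]])
    Ry = np.array([[ np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]])
    dirs = dirs @ (Ry @ Rx).T

    # Convert rays to longitude/latitude, then to equirectangular pixel coords.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])
    lat = np.arcsin(np.clip(dirs[..., 1], -1, 1))
    map_x = ((lon / (2 * np.pi) + 0.5) * W).astype(np.float32)
    map_y = ((lat / np.pi + 0.5) * H).astype(np.float32)
    # Note: this simple version does not wrap the 360 seam at the frame edge.
    return cv2.remap(equi, map_x, map_y, cv2.INTER_LINEAR)

# Example usage (hypothetical file names):
# view = equirect_to_perspective(cv2.imread("stitched_equirect_frame.jpg"), yaw_deg=45)
# cv2.imwrite("flat_preview.jpg", view)
```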

There are also the graphics to think about. We use After Effects for our broadcast graphics and were delighted to discover that a smaller company, Mettle, makes a plugin for After Effects that not only lets you treat text in a 3-D space, but can also be used to create entire graphic VR experiences (we'll be publishing one of those later this summer).

Lastly, a definite highlight for me at NAB was a speech from Lytro CEO Jason Rosenthal on their new light field camera. The camera is definitely geared more toward traditional filmmakers; however, it was a glimpse of a new way to think about film. It captures the light field, giving you a lot more data to work with. (A link to a great article on it is here)

VR is very much still in its infancy, but with the incredible investment from some of the biggest tech companies in the world, people are truly taking notice. I often get asked how VR compares to 3-D. Will it be just a passing fad? The reason I feel VR is on a different level lies in the experience. People pay for experiences that provide a different perspective, and I've never had that feeling with 3-D TVs. With VR, I feel like a child exploring a new world full of wonder, and personally, I am hungry for more.

Make sure to check out FRONTLINE's latest 360 project, "Night of the Storm," here!




