Our Evolution of Storyboarding

May 9, 2012

When our current team solidified at the very beginning of Morehead Production, we came from very different backgrounds: Jay, a designer and journalist; Pete, a motion graphics artist and compositor; and myself, an animator. The one thing we all had in common, however, was that none of us had ever done dome work. We have since learned from our early mistakes and taken new steps to refine our process for making dome shows. Among those changes has been our storyboarding.

Earth, Moon, and Sun

This was the first show we made, so our storyboards were closer to those for flat screens than for domes. We recognized the challenge of drawing a storyboard accurately for the dome and tried to account for it, but once production and animation started it was clear the boards were very limited and ultimately not very useful.

Magic Tree House

Our second show, Magic Tree House, is an adaptation of the original analog show we had here at Morehead Planetarium. With the Zeiss’ retirement drawing closer, we were tasked with making a digital version of the show. Although nearly every scene was simply updated with better visuals, there was one sequence we were able to re-imagine. This is where we introduced the idea of drawing the storyboards in the dome itself. It was very successful, and there were no surprises when it came time for actual production, because the composition you saw in the storyboards translated directly to what we’d do in production. It took a little getting used to drawing with the distortion, but it was ultimately worth it.

Solar System Odyssey

After the success of storyboarding in the dome for Magic Tree House, it made the most sense to continue that practice with Solar System Odyssey. A challenge presented itself, however: this was a character-driven show. In all our previous productions, there was either a brief appearance of a single character or no characters at all. Solar System Odyssey had a staggering three characters on screen at nearly all times. Because of this, we had to think about and draw not only where the characters would be on screen, but also where the camera would go to give the best staging for our characters. This is where we introduced the idea of our ‘dance chart’, which we would make for each scene with characters. I also called it the Law and Order effect (in homage to one of my favorite shows on television).

The Longest Night

The advances and evolutions we’ve been making in storyboard development took a strange turn with this show, The Longest Night. Not only was it to be our first show with significant amounts of live action, but it would also be a collaboration between us and Paperhand Puppet Intervention. There were plenty of challenges already in storyboarding a hybrid of live action and digital environments, but added to that was taking the script and boards they drew and adapting them for the dome.

This is an example of one of the boards that Paperhand created before it was taken to the dome. We could get an idea of what they wanted to be seen, but we needed to see it on the dome to get a better sense of scale and of the placement of the characters and environments. Using what we learned from past shows and adapting the dance chart to include ‘real world’ camera and ‘digital’ camera placement, we came up with the example seen here.

As I said in the title, this is our evolution of storyboarding. There will never be an end to the changes we make in our process as we learn and grow in the dome field. We will just keep applying and adapting what we have learned in the past to our work in the future.

Collaborating with the non-digital

May 4, 2012

In the first four fulldome shows that Morehead produced, we didn’t have to do much collaboration with outside groups. We’d sometimes contract out a writer or the composer, but for the most part, our productions were created almost completely in-house. That all changed when we met with Donovan Zimmerman from Paperhand Puppet Intervention and decided to collaborate on our newest show in production, The Longest Night. Paperhand had been putting on stage performances in North Carolina for over 10 years and we loved their style, aesthetic and message. But collaborating with a completely non-digital group presented some challenges.

The basic plan was for Paperhand to write the script; we’d take it and produce storyboards and pre-viz. After that we’d shoot live action of Paperhand’s giant puppets on green screen and do the digital production, and finally Paperhand would score the show. Every step of the way we’d make sure to COLLABORATE, meaning that we weren’t just dividing up the tasks, but bouncing ideas off each other and making sure we were both happy with each stage. Easier said than done.

The first challenge was that Paperhand had never been involved with a dome production. In fact, they’d never been involved with a film production of any of their shows. So we were starting from scratch. The first thing I like to do is to explain the basic steps in making a show:

1. Concept & Script

2. Storyboards & Concept Art

3. Animatics/Pre-Viz & Scratch Audio

4. Production (film and 3D) & Sound FX/V.O.

5. Music, Narration and Post

But another thing I like to show to newbies is this great video from Cirkus Productions out of New Zealand about the ABCs of the animation process.

Once we were fairly certain they had wrapped their heads around the basic process, we had to make sure that they understood the differences between STAGE, SCREEN and DOME productions. Luckily, in some ways, a dome production with live actors is actually more similar to a stage production than a film. It’s just that the stage surrounds the audience, and we can move from scene to scene much more quickly.

As you know, you don’t cut on the dome like you do in a film. Instead, you need to draw the viewer’s eye to what is important using other techniques. Paperhand had plenty of experience working in that manner. So once we had convinced them that it wouldn’t work to cut to a close-up of an object or character, and got them back to using their old-school techniques, things went much more smoothly. Oddly enough, this was one situation where having less “film” knowledge worked out to our advantage.

What they weren’t prepared for was the limitation the color green imposes when shooting on a green screen. It’s tough for someone who has been making puppets with green in them for 15 years to be told they can’t use that color. If all else had failed and they’d ended up with green on their puppets, we could have painted the room blue.
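Why green on a puppet is a problem can be shown with a toy chroma keyer. The sketch below is a drastic simplification in plain Python (the threshold and sample pixel values are made up for illustration); a real keyer produces soft mattes and handles spill, but the failure mode is the same: any pixel where green clearly dominates gets keyed to transparent, including green paint on the subject.

```python
def chroma_key_alpha(pixel, dominance=40):
    """Return 0 (transparent) if green clearly dominates red and blue,
    else 255 (opaque). Real keyers are far more forgiving at the edges,
    but green on the subject still keys out, just like the backdrop."""
    r, g, b = pixel
    if g - max(r, b) > dominance:
        return 0
    return 255

backdrop = (30, 200, 40)      # green-screen backdrop
puppet_red = (180, 50, 60)    # a safe puppet color
puppet_green = (40, 190, 50)  # green detail on the puppet

print(chroma_key_alpha(backdrop))      # keyed out, as intended
print(chroma_key_alpha(puppet_red))    # kept
print(chroma_key_alpha(puppet_green))  # also keyed out -- a hole in the puppet
```

A blue-screen shoot is the same idea with the dominant channel swapped, which is why painting the room blue was the fallback.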

The next thing we had to explain to our new non-digital friends was the beauty of digital magic. At first, they assumed that when creating concept art or pieces for the production, they’d have to make them life-sized. They also thought that if we needed 200 trees, they’d have to paint 200 trees. They were happy to hear about the copy-and-paste functions in After Effects, and happy to realize that we could tweak colors very easily in the system without them having to paint new versions. The great thing is that we ended up with a lot of fantastic concept art. Here are some examples:

Early Concepts of The Longest Night

May 1, 2012

Early on in the planning stages of “The Longest Night,” we realized there were going to be some big changes to how we approached production. Typically we work with animated CG characters and environments. We can dictate actions and are in charge of a camera that is essentially unlimited in its range of motion.

With this production we knew we were going to need to actually capture some performances and integrate that footage into some sort of environment. An example of what we thought would stylistically work with the look and feel of a typical Paperhand Puppet Intervention show is the Modest Mouse video “Float On”.

The flat nature and stylistic treatment of the video seemed to be flexible in the sense that it didn’t have to make physical sense and could be executed with static camera green screen shots. The footage could be mapped to flat cards and be moved around. We shot some test footage and made a quick proof of concept.

We found this technique to be functional but limiting.

We eventually realized that we could put movement into the green-screen footage and match-move it. Match-moving is a post-production technique in which tracking points are extracted from the footage, and software uses those points to digitally reconstruct what the camera was doing in the real world. We can then use that data to animate a fisheye camera in 3D space, allowing more dynamic and realistic uses of the footage in relation to the camera. This was a theory that we decided to test out.
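The core idea of match-moving, that tracked 2D points plus known scene geometry pin down where the camera was, can be sketched with a toy example. This is a drastic simplification (a simple pinhole model with known rotation and depths, and made-up point values); real match-moving software solves rotation, translation, and lens parameters together from hundreds of tracked features across many frames.

```python
# Toy camera solve: recover a camera's sideways offset from tracked points.
f = 500.0  # assumed focal length, in pixels

# Known 3D scene points (X, Y, Z), with Z = depth from the camera plane
scene_points = [(1.0, 2.0, 10.0), (-3.0, 0.5, 8.0), (2.5, -1.0, 12.0)]

true_cam = (0.4, -0.2)  # the camera offset we will try to recover

def project(point, cam):
    """Pinhole projection of a 3D point for a camera offset (tx, ty)."""
    x, y, z = point
    tx, ty = cam
    return (f * (x - tx) / z, f * (y - ty) / z)

# Stand-in for the 2D tracks a tracker would pull out of the footage
tracked = [project(p, true_cam) for p in scene_points]

# Invert the projection per point and average: tx = X - u * Z / f
n = len(scene_points)
est_tx = sum(x - u * z / f for (x, y, z), (u, v) in zip(scene_points, tracked)) / n
est_ty = sum(y - v * z / f for (x, y, z), (u, v) in zip(scene_points, tracked)) / n

print(round(est_tx, 6), round(est_ty, 6))  # recovers (0.4, -0.2)
```

Once the solved camera data exists, it can drive a virtual fisheye camera so the digital environment moves in lockstep with the real-world footage.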

You can see the final result of our test below.

Below is an earlier version where you can see the original footage before it was keyed out.

We matched the real world footage to a digital background. This rough test was the foundation for the work we would then build upon for designing the rest of the show.