
IPS and Domefest

July 24, 2012

Some of us at Morehead are heading down to Baton Rouge this week for IPS and Domefest and we’ll be up on the big screen a lot. Join us in Sky-Skan’s Zendome (the biggest dome in the dome village) at 3:15 on Wednesday for a screening of Solar System Odyssey. Clips from two of our shows also made it into Sky-Skan’s Best of Fulldome reel during their main presentation. Lastly, we’re happy to announce that Jeepers Creepers is a Juried Finalist and on the Juried Show Reel for Domefest this year. See you there!

Our Evolution of Storyboarding

May 9, 2012

When our current team solidified at the very beginning of Morehead Production, we came from very different backgrounds: Jay, a designer and journalist; Pete, a motion graphics artist and compositor; and myself, an animator. The one thing we all had in common, however, was that none of us had ever done dome work. We have since learned from our early mistakes and taken new steps to refine our process for making dome shows. Among those changes has been our storyboarding.

Earth, Moon, and Sun

This was the first show we made, so our storyboards were closer to those for flat screens than for domes. We recognized the challenge of drawing a storyboard accurately for the dome and tried to account for it, but once production and animation started it was clear the boards were very limited and ultimately not very useful.

Magic Tree House

Our second show, Magic Tree House, is an adaptation of the original analog show we had here at Morehead Planetarium. With the Zeiss' retirement drawing closer, we were tasked with making a digital version of the show. Although nearly every scene was simply updated with better visuals, there was one sequence we were able to re-imagine. This is where we introduced the idea of drawing the storyboards in the dome itself. It was very successful, and there were no surprises when it came time for actual production, because the composition you saw in the storyboards translated directly to what we'd do in production. It took a little getting used to, drawing with the distortion, but it was ultimately worth it.

Solar System Odyssey

After the success of storyboarding in the dome for Magic Tree House, it made sense to continue that practice with Solar System Odyssey. A new challenge presented itself, however: this was a character-driven show. In all our previous productions there had been either a brief appearance by a single character or no characters at all. Solar System Odyssey had a staggering three characters on screen at nearly all times. Because of this, we had to think about and draw not only where the characters were on screen, but also where the camera should be to give our characters the best staging. This is where we introduced our 'dance chart,' which we made for each scene with characters. I also called it the Law and Order effect (a nod to one of my favorite shows on television).

The Longest Night

The advances and evolutions we'd been making in storyboard development took a strange turn for this show, The Longest Night. Not only was it to be our first show with significant amounts of live action, it would also be a collaboration between us and Paperhand Puppet Intervention. Storyboarding a hybrid of live action and digital environments presented plenty of challenges on its own; added to that was taking the script and boards Paperhand drew and adapting them for the dome.

This is an example of one of the boards that Paperhand created before it was taken to the dome. It gave us an idea of what they wanted to see, but we needed to view it on the dome to get a better sense of scale and of the placement of the characters and environments. Using what we learned from past shows and adapting the dance chart to include both 'real world' and 'digital' camera placement, we came up with the example seen here.

As I said in the title, this is our evolution of storyboarding. There will never be an end to the changes we make in our process as we learn and grow in the dome field. We will just keep applying and adapting what we have learned in the past to our work in the future.

Collaborating with the non-digital

May 4, 2012

In the first four fulldome shows that Morehead produced, we didn't have to do much collaboration with outside groups. We'd sometimes contract out a writer or a composer, but for the most part our productions were created almost completely in-house. That all changed when we met with Donovan Zimmerman from Paperhand Puppet Intervention and decided to collaborate on our newest show in production, The Longest Night. Paperhand had been putting on stage performances in North Carolina for over 10 years, and we loved their style, aesthetic and message. But collaborating with a completely non-digital group presented some challenges.

The basic plan was for Paperhand to write the script; we'd take it and produce storyboards and pre-viz. After that we'd shoot live action of Paperhand's giant puppets on green screen and do the digital production, and finally Paperhand would score the show. Every step of the way we'd make sure to COLLABORATE, meaning that we weren't just dividing up the tasks, but were bouncing ideas off each other and making sure we were both happy at every stage. Easier said than done.

The first challenge was that Paperhand had never been involved with a dome production. In fact, they’d never been involved with a film production of any of their shows. So we were starting from scratch. The first thing I like to do is to explain the basic steps in making a show:

1. Concept & Script

2. Storyboards & Concept Art

3. Animatics/Pre-Viz & Scratch Audio

4. Production (film and 3D) & Sound FX/V.O.

5. Music, Narration and Post

But another thing I like to show to newbies is this great video from Cirkus Productions out of New Zealand about the ABCs of the animation process.

Once we were fairly certain they had wrapped their heads around the basic process, we had to make sure they understood the differences between STAGE, SCREEN and DOME productions. Luckily, in some ways a dome production with live actors is actually more similar to a stage production than to a film. It's just that the stage surrounds the audience and we can move from scene to scene much more quickly.

As you know, you don't cut on the dome the way you do in a film. Instead, you need to draw the viewer's eye to what is important using other techniques. Paperhand had plenty of experience working in that manner, so once we had convinced them that cutting to a close-up of an object or character wouldn't work, and got them back to using their old-school techniques, things went much more smoothly. Oddly enough, this was one situation where having less "film" knowledge worked to our advantage.

What they weren't prepared for was the limitation on the color green when shooting on a green screen. When you've been making puppets with green in them for 15 years, it's tough to be told you can't use that color. If all else failed and they ended up with green on their puppets, we could always have painted the room blue.

The next thing we had to explain to our new non-digital friends was the beauty of digital magic. At first they assumed that, when creating concept art or pieces for the production, they'd have to make them life-sized. They also thought that if we needed 200 trees, they'd have to paint 200 trees. They were happy to hear about the copy and paste functions in After Effects, and happy to realize that we could tweak colors very easily in the system without them having to paint new versions. The great thing is that we ended up with a lot of fantastic concept art. Here are some examples:

Early Concepts of The Longest Night

May 1, 2012

Early on in the planning stages of “The Longest Night,” we realized there were going to be some big changes to how we approached production. Typically we work with animated CG characters and environments. We can dictate actions and are in charge of a camera that is essentially unlimited in its range of motion.

With this production we knew we were going to need to actually capture some performances and integrate that footage into some sort of environment. An example of what we thought would stylistically work with the look and feel of a typical Paperhand Puppet Intervention show is the Modest Mouse video “Float On”.

The flat nature and stylistic treatment of the video seemed flexible in the sense that it didn't have to make physical sense and could be executed with static-camera green screen shots. The footage could be mapped to flat cards and moved around. We shot some test footage and made a quick proof of concept.

We found this technique to be functional but limiting.

We eventually realized that we could put movement into the green screen footage and match-move it. Match-moving is a post-production technique in which software tracks points in the footage and uses them to reconstruct, digitally, what the camera was doing in the real world. We could then use that data to animate a fisheye camera in 3D space, allowing more dynamic and realistic uses of the footage in relation to the camera. This was a theory we decided to test out.
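
To make that last step concrete, here is a minimal, hypothetical sketch of applying a camera solve as keyframes in Maya. The solve values and names below are made up for illustration; in practice the match-move software produces this per-frame data for you.

```python
# Hypothetical sketch: key a per-frame camera solve onto a Maya camera.
# The solve values below are placeholders; a real match-move package
# exports this kind of per-frame translate/rotate data.
import maya.cmds as cmds

solve = [
    # (frame, (tx, ty, tz), (rx, ry, rz)) -- placeholder values
    (1, (0.0, 1.6, 0.0), (0.0, 0.0, 0.0)),
    (2, (0.1, 1.6, 0.0), (0.0, 1.5, 0.0)),
    (3, (0.2, 1.6, 0.1), (0.0, 3.0, 0.0)),
]

cam_transform, cam_shape = cmds.camera()  # camera that will receive the solve

for frame, translate, rotate in solve:
    for axis, t_val, r_val in zip('XYZ', translate, rotate):
        cmds.setKeyframe(cam_transform, attribute='translate' + axis,
                         time=frame, value=t_val)
        cmds.setKeyframe(cam_transform, attribute='rotate' + axis,
                         time=frame, value=r_val)
```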

You can see the final result of our test below.

Below is an earlier version where you can see the original footage before it was keyed out.

We matched the real world footage to a digital background. This rough test was the foundation for the work we would then build upon for designing the rest of the show.

Producing our newest show – The Longest Night

April 18, 2012

We’re right in the middle of production on our newest show, tentatively called The Longest Night: A Winter’s Tale.  The show is being created in collaboration with Paperhand Puppet Intervention, a talented crew of people who normally produce live theater with giant puppets, masks, stilt dancing, rod puppets, shadows or silhouettes, and anything else they think will work. They’re wildly popular in our part of North Carolina and we like their stage shows so much that we thought it would be great to put them up on the dome and send them around the world.

For the past six months we’ve been developing ideas, writing scripts, creating storyboards and pre-visualizing a show unlike anything that’s been seen up on the dome. It’s an experiment, that’s for sure. But as it develops, we’re getting more and more excited about it. In the weeks to come, we’re going to be posting updates on When In Dome about the ideas, the process and the people involved. Stay tuned…

Morehead heading to IMERSA Summit

January 27, 2012

Next week we are heading out to Denver for the February 3-5, 2012 IMERSA Summit. This year's theme is "Lessons from our past, Visualizing our future: Winning solutions for the digital dome." We're screening our new show, Solar System Odyssey, at 5:30 PM on Friday, and I'll be giving a presentation directly afterward entitled "Domenclature," about the need to create a "film language" for the dome and what we learned producing SSO. If you're going to be there, please stop by, see the show and say hi.

Looking to the spring, we'll also be screening Solar System Odyssey at the Jena Fulldome Festival in Germany in May and our short, Jeepers Creepers, at the Buenos Aires Independent International Festival of Cinema in Argentina in April.

Defining “fulldome” to a layperson

January 24, 2012

I was recently at a non-planetarium, non-fulldome conference for science communicators called ScienceOnline. The attendees I met, who happened to be mostly scientists, science journalists or PR people, generally didn't know what I meant when I said I "produced fulldome video." As many of us have experienced, saying that you make "planetarium shows" doesn't quite work either, because most adults tend to think of pre-digital shows. It's a good thing I attended a session called "Pimp Your Elevator Pitch" and decided to use it to work on defining what fulldome is in less than 45 seconds.

Here’s what I ended up with:

Fulldome videos are primarily science documentaries that are projected onto a domed surface, typically in a planetarium. Many fulldome videos deal with astronomy, but other subjects are appropriate for the dome, especially topics or environments that are difficult to experience as a human being, such as deep underwater, inside the human body or in the future. We like to think of a flat screen video as a window into another world but with a fulldome video you can poke your head up inside that world and become immersed within it. Think of a 3D animated movie crossed with IMAX and put it in a planetarium.

Some feedback I got on my original pitch was that I started by saying that they're "not planetarium shows," which instantly put the idea of an analog show in people's minds. I also originally described them as a combination of PIXAR and IMAX in a dome and was told that people in very rural areas might not know what PIXAR or IMAX are. Something to keep in mind.

Any other ideas out there? How have you described “fulldome” to others quickly?

Advantages of the Dome-AFL shader

January 23, 2012

When we started producing dome content 4 years ago, we were working on two different 3D platforms, 3ds Max and Maya, and still doing a 5-camera stitch with a hemi-cube. We used the 5-camera stitch to create our first two productions, "Earth, Moon, and Sun" and "Magic Tree House." For our most recent production, "Solar System Odyssey," we knew we wanted to try something different. Since we were doing a character-driven piece, I took it upon myself to learn Maya. One of the greatest achievements of that production was the proper implementation of the DomeAFL shader for mental ray, created by Daniel Ott.

This opened up new doors for rendering and camera techniques. Eliminating the time spent manually stitching comps together freed us up to tackle more challenging aspects of production. One of the new features we were able to render was an ambient occlusion pass that gave our elements new depth.

We were no longer fighting to fit disjointed pieces together before running out of time; instead we were able to refine our work from a rough state into a more polished product.
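
For the curious, here is a rough sketch of the mapping an angular fisheye (domemaster-style) lens shader performs, assuming a 180-degree dome. This is the textbook formula rather than DomeAFL's actual source, but it shows why a single fisheye render can replace a 5-camera stitch.

```python
import math

def fisheye_ray(x, y, fov_deg=180.0):
    """Map normalized image coordinates (x, y in [-1, 1]) to a ray
    direction for an angular fisheye lens. Standard domemaster-style
    mapping: the camera looks down +Z, with the dome zenith at the
    center of the image."""
    r = math.hypot(x, y)
    if r > 1.0:
        return None  # outside the circular fisheye image
    phi = math.atan2(y, x)                    # azimuth around the view axis
    theta = r * math.radians(fov_deg) / 2.0   # angle away from the view axis
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```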

Recently we upgraded our software from Maya 2008 to Maya 2012, and in that upgrade the shader stopped working. Fortunately, I was able to locate an updated version (via Roberto Ziche on http://fulldome.ning.com/forum). The work these fine folks are doing is taking the shader to new dimensions by adding support for stereoscopic imagery.

2D Shake in After Effects

January 5, 2012

In a previous post Jim talked about doing a believable shake on the 3D camera itself. With motion blur turned on, this can get a bit expensive in terms of render time, so sometimes we lean on After Effects to push a shake to even greater extremes.

In this example you'll see a 2D shake added to enhance the launch sequence. On a flat screen the shake doesn't seem all that extreme, but on a dome it feels much more intense. In the last shot of the sequence I did a 3D camera shake and felt it needed to be pushed further. Rather than re-animate, we used After Effects and added a 2D wiggle on top of the existing shake to get the desired look.

I do this using the wiggle expression in After Effects: wiggle(a,b), where a is the frequency of the wiggle per second and b is the amplitude (how far it moves).

I link those values to sliders so I can animate how much wiggle I want. With the wiggler ready to go, I apply it to a null; the location of the null becomes the center point of the wiggle. Then parent your footage to the null.
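
For anyone who wants to try this, the expression on the null's position ends up looking roughly like the snippet below. The slider names are just examples; use whatever you called your Slider Controls.

```javascript
// After Effects position expression on the shake null.
// "Frequency" and "Amplitude" are Slider Controls added to this same layer.
freq = effect("Frequency")("Slider");
amp  = effect("Amplitude")("Slider");
wiggle(freq, amp);  // freq = wiggles per second, amp = pixels of movement
```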

Depending on how comfortable you are with After Effects, I might have lost you by now, so feel free to watch the following tutorial about wiggle and its various uses.

Solar System Odyssey flat trailer

January 4, 2012

We just rendered a flat screen version of the trailer for our newest show, Solar System Odyssey. Looks pretty good in a rectangular format, if I do say so myself. Check it out below, but of course you'll need to see it on a dome to get the full effect.