Useful modeling script for Maya

November 28, 2011

For a long time I was a 3ds Max user, and only in the last year have I switched to Maya. One tool in 3ds Max that was incredibly useful for building hard-surface models or repeating objects was the array function. Thankfully, I found a script developed by Ed Caspersen that brings this functionality into Maya.

http://www.creativecrash.com/maya/downloads/scripts-plugins/utility-external/copying/c/array-for-maya

I used this tool to produce the following model of a launch tower in less than 2 hours.

With this you can build a small section of geometric detail and control how it is replicated in any direction, even setting it to get gradually smaller. Working in a 4k immersive format, you can only get so close to textures before you start to see individual pixels or the resolution soften. The extra geometry helps break up the visual landscape and compensates for those instances where textures start to fall apart. It’s perfect for building repeating shapes quickly and adding the much-needed detail that the fulldome format demands.
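To make the idea concrete, here’s a rough sketch in plain Python (not the script’s actual MEL or API — the function name, parameters, and defaults are mine, purely for illustration) of the transforms an array tool like this computes: each copy gets a fixed positional step, plus an optional uniform scale falloff so the copies gradually shrink.

```python
# Sketch of the transforms an array tool applies: each copy is offset
# by a fixed step and scaled by a constant falloff factor.
# Names and default values are illustrative, not the script's real API.

def array_transforms(copies, step=(0.0, 2.0, 0.0), scale_falloff=0.9):
    """Return a (position, scale) pair for each copy.

    copies        -- number of duplicates to lay out
    step          -- per-copy translation offset (x, y, z)
    scale_falloff -- uniform scale multiplier applied per copy
    """
    transforms = []
    for i in range(copies):
        position = tuple(axis * i for axis in step)  # i steps from origin
        scale = scale_falloff ** i                   # shrinks geometrically
        transforms.append((position, scale))
    return transforms

# Four copies marching up the Y axis, each 10% smaller than the last:
for pos, scale in array_transforms(4):
    print(pos, round(scale, 3))
```

In Maya you’d feed each pair to a duplicate of the source geometry; the point is just that a whole tower of detail reduces to one small section plus a step vector and a falloff.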


How close is too close?

November 16, 2011

One of the dangers we run into during our productions is object distortion. It’s most frequently seen when you fly towards or away from a moon or planet. That dreaded bulge is caused by the closest part of the sphere being much closer, and therefore much larger, than the farther parts of the surface. We have been actively trying to avoid these situations in our shows, as the distortion tends to break the illusion of immersion. Sometimes, however, it is unavoidable, whether demanded by the script or the storyboards. In these cases we try to make the close-to-camera action happen as quickly as possible, so as not to let the mind start to think, “Boy, that really looks strange!”

Here’s an example I quickly threw together showing various distances.
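As a back-of-the-envelope illustration of why the bulge gets worse the closer you fly (a sketch with made-up units, not measurements from our scenes): for a sphere of radius R viewed from distance d, the nearest surface point sits at d − R while the visible limb sits at √(d² − R²), and the ratio between those two distances blows up as the camera approaches the surface.

```python
import math

# Why spheres "bulge" close up: the nearest point of the surface is far
# closer to the camera than the visible limb, so it renders
# disproportionately large. Units here are arbitrary and illustrative.

def bulge_ratio(radius, camera_distance):
    """Distance to the visible limb divided by distance to the nearest
    surface point. 1.0 means no distortion; higher means more bulge."""
    nearest = camera_distance - radius               # closest surface point
    limb = math.sqrt(camera_distance**2 - radius**2)  # edge of the disc
    return limb / nearest

# A unit-radius moon seen from progressively closer camera distances:
for d in (10.0, 3.0, 1.5, 1.1):
    print(d, round(bulge_ratio(1.0, d), 2))
```

From ten radii out the ratio is barely above 1, but within a couple of radii it climbs fast — which matches what we see on the dome.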

How to decide what to build

November 11, 2011

Designing models and assets to be used in Fulldome video requires you to think about a combination of variables.

  1. How long will it be on screen?
  2. How fast is it moving?
  3. How close will it be to the camera?
  4. How many times will we use it?

We developed this method of evaluation after our first production, during which we learned the importance of pre-visualizing our 3D scenes before even building our models. The following is a lunar vehicle we designed and custom-built for a sequence on the Moon. In the final shot, the vehicle was seen from a long-distance aerial flyover and was on screen only briefly.


Given the time and resources put into this model, we will repurpose it for another show, should the need arise.

The importance of previz in fulldome production

October 24, 2011

Before diving in, I realize that some of you may not have heard the word “previz.” “Previz,” or pre-visualization, is a production step after storyboarding and before final animation in which simple models are laid out in 3D space, basic animation is done, and camera moves are locked in place. This gives the director a better idea of what the final shot will look like before any intensive work is done on the models or the scene. It also allows camera moves to be changed without needing to do extensive rendering.

Let’s back up a bit and put this in context.

Our production process has five major steps:
1. Scriptwriting & Concept Art
2. Storyboarding
3. Animatics & Voice-over
4. Previz & Sound Effects
5. Final animation & Score

The difficult moment in any film/TV/dome production is moving from the animatics phase (essentially a flipbook storyboard with scratch audio) to the final animation stage without really knowing what the shot will look like. A good example is a scene in our latest show, Solar System Odyssey, in which our two heroes are trying to escape the radiation belt around Jupiter, which is wreaking havoc on their ship. This is what the original storyboard/animatic looked like:

As you can see, there was a lot of proposed camera movement in that shot. The difficulty was knowing how much movement would make the scene interesting and tense without leaving the audience confused or nauseous. So we took low-poly renders of the characters, did basic animation on them, and placed them in a simply textured, low-poly environment. This is what it looked like:

The previz stage gave us some great intel. We realized the shot felt dead: there was very little tension with the current camera moves. And since it’s difficult to build tension through editing, as you would in a flat-screen film, we knew we’d have to make the camera moves more dynamic. We did this by speeding up the moves between rest points and adding Dutch angles at the pause points. This was the final product:

Previz is becoming very popular in Hollywood, typically for action shots. We already find it an integral part of our process. Not only does it let us more clearly visualize the final look of a shot, it actually speeds up production by preventing us from having to go back and re-tweak an already rendered shot. For a great look at how previz is being used in Hollywood, check out this video:

A Tribute to the Zeiss Mark VI Star Projector

October 20, 2011

A few months ago, we officially retired the Zeiss Mark VI Star Projector at Morehead Planetarium. We’d had it for 42 years, and it served us well. But the ol’ Zeiss had been showing its age, and despite the heroic efforts of our Chief Technician, Steve Nichols, to keep it going, the decision was made to put ’er down. Since we added a digital system to our planetarium over a year ago, we were able to roll right on forward. Before the Zeiss was dismantled, though, we shot some footage of it and put together a short tribute video. RIP Zeiss.

Solar System Odyssey Trailer!

October 12, 2011

We just released the trailer for our newest show – Solar System Odyssey. We’ll be showing the trailer at ASTC this weekend during SkySkan’s after-hours presentation.

Our story takes place far in the future with an Earth on the verge of environmental collapse. Billionaire Warren Trout thinks he can make a fortune colonizing the rest of the solar system and sends space pilot Jack Larson to find out where. But there’s one thing he didn’t count on – Ashley, Trout’s daughter, has stowed away on board the ship and has her own ideas.

Our intention was to make sure the show was filled with science, but to also have an exciting, entertaining story as well. Let us know what you think.

Camera Shake on the Dome

October 12, 2011

One of the things I discovered is that normal camera translation doesn’t really work for camera shake on the dome. Hardly any motion is perceived unless the camera moves enormous distances. The most effective approach I found is to rotate the camera rather than change its position. This really makes the audience feel uneasy and unbalanced, which is exactly what we want a camera shake to convey.

Here’s an example of it from our new show, Solar System Odyssey.
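A minimal sketch of the idea in plain Python (not our actual Maya setup — the function name, amplitude, and frame count are made-up example values): generate small per-frame rotation offsets and key them onto the camera’s rotation channels instead of its translation.

```python
import random

# Rotational camera shake: small random per-frame rotation offsets.
# Keyed onto the camera's rotate channels these read as shake on the
# dome; the same magnitudes applied as translation barely register.
# All values here are illustrative.

def rotation_shake(frames, amplitude_deg=1.5, seed=7):
    """Return per-frame (rx, ry, rz) rotation offsets in degrees."""
    rng = random.Random(seed)  # seeded so the shake is repeatable
    return [
        tuple(rng.uniform(-amplitude_deg, amplitude_deg) for _ in range(3))
        for _ in range(frames)
    ]

# One second of shake at 24 fps, ready to key onto a camera:
keys = rotation_shake(24)
```

In practice you’d smooth or ease these values so the shake doesn’t read as pure jitter, but the core trick is simply that the offsets go on rotation, not translation.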

A fun example of an Immersive Environment!

September 6, 2011

A few months ago I had the pleasure of going to Universal Studios in Orlando, Florida. The Simpsons Ride had recently taken the place of the old Back to the Future ride. Although it was sad to see Marty and Doc gone, The Simpsons Ride was a worthy replacement, and a great example of a fun dome show.

The animation was spectacular, paying good attention to make character actions and poses clear despite being a dome show. If you’re ever visiting Universal Studios, be sure to hop on this ride!

When to fake it, 2D and 3D particles

September 2, 2011

If there is one thing we’ve learned, it’s that particles can be expensive in terms of development, implementation, and hardware resources. Many effects call for particles, though, and sometimes using them is unavoidable. A workflow we’ve come to rely on is generating a particle system in After Effects using the Trapcode Particular plugin, then mapping that image sequence to a plane in 3D to get the look we need without spending hours tweaking a fluid or particle system in Maya.

You could even use the same principle with stock footage of bullet hits and explosions. This process works best for systems that have limited interaction with their environment and that the camera sees from a distance. Typically we’ve used it for bursts, explosions, and a few eruptions.

For the flat screen, this concept of layering 2D effects over your comps isn’t new. Applying the idea to the dome requires you to either match it by hand using one of the dome plugins for After Effects, or map it to geometry in a 3D scene and render it with a five-camera stitch or fisheye. These passes generally take little to no time, even at 4k resolution, because you’re essentially rendering a simple piece of geometry with a single image texture. The texture files are generally 2k, unless the situation calls for more resolution.
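One quick sanity check you could run before picking a texture size (a sketch with assumed numbers, not an actual pipeline tool of ours): estimate how many dome pixels the card will cover at its distance from the camera, assuming a 4k fisheye master spanning 180 degrees. The card sizes and distances below are illustrative.

```python
import math

# Back-of-the-envelope texture-resolution check for a 2D effect card.
# Assumes a 4k fisheye dome master covering 180 degrees; card width
# and distance are in arbitrary, consistent scene units.

DOME_PIXELS = 4096
DOME_FOV_DEG = 180.0
PX_PER_DEG = DOME_PIXELS / DOME_FOV_DEG  # ~22.8 dome pixels per degree

def required_texture_width(card_width, distance):
    """Approximate horizontal dome pixels a flat card covers."""
    # Angle subtended by the card at the camera:
    angle_deg = math.degrees(2 * math.atan(card_width / (2 * distance)))
    return angle_deg * PX_PER_DEG

# A 10-unit-wide explosion card seen from 50 units away:
print(round(required_texture_width(10.0, 50.0)))
```

For a distant burst like this the card covers only a few hundred dome pixels, so a 2k texture is comfortable overkill; it’s only when the card drifts close to camera that you’d reach for more resolution.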