What does it take to make a fulldome show?

October 22, 2009

It takes a little more than grit and perseverance, although that probably helps. In the analog days, at least at Morehead, planetarium shows were put together using about 60 slide projectors, 3 video projectors, a slew of opto-mechanical doodads, a computer to time everything out, and a huge star projector that sat in the middle of it all. The production staff consisted of two people, an outside contractor to do some artwork, and a music composer.

Now that we’re in the process of going digital, things will be very different within the planetarium dome itself. The plan is to have two huge projectors that will project a 4000×4000 pixel image onto the screen. To put that in perspective, it’s roughly 8 times bigger than high-definition television. But the production staff is fairly similar. We’ve now got a producer, a director, two main animator/compositor/creative directors, the same music composer we used for the old shows, and support from the Morehead staff and others here and there.
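For the curious, here’s the back-of-the-envelope math behind that “8 times” claim (a quick sketch in Python; only the numbers matter):

```python
# Pixel counts: one 4000x4000 fulldome frame vs. one 1920x1080 HD frame.
dome_pixels = 4000 * 4000       # 16,000,000 pixels
hd_pixels = 1920 * 1080         #  2,073,600 pixels

print(dome_pixels / hd_pixels)  # ~7.7 -- roughly 8x HD
```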

It takes this four-to-five-person crew anywhere from 9 to 15 months to create a 25-minute 3D animated dome show, depending on the content and situation. To put that in perspective, it took Pixar up to four years, at least 400 people, and $180 million to make WALL•E. Compared to that, we’re definitely coming in under budget.

Explaining Fulldome without being in the dome.

October 5, 2009

It’s always hard to explain exactly what the fulldome experience is like without being there to see it in all its 4K glory, especially if the person you’re talking to has never been in a planetarium or, as in some cases, has never heard of a planetarium (yes, those people are out there). xRez Studio, the team that produced “Crossing Worlds,” which won the Domie this year for best immersive world design, just put up an interactive video pano in which the user can move around a virtual dome space while a fulldome show is playing. It may help in our collective struggle to explain to the uninitiated exactly what it is we do. You can check it out here – http://www.xrez.com/cw_video_pano/ – just click and drag to pan around, and scroll to zoom in and out. Also, check out Crossing Worlds.


Adventures in Z-Space

September 28, 2009

When filming in a traditional flat-screen medium, one may use a variety of lenses to create certain dramatic effects. These lenses can be combined with zooms and dolly moves to create the oh-so-dramatic zolly, where the character’s world shifts around them. This video I found explains it pretty well, even though it’s a bit cheesy in style.

What’s covered are some classic filming techniques, but how can we translate them to the dome?

Unlike a window, where you really only have one direction of z-space to sell, a dome is 360 degrees of z-space. The viewer is fixed in the middle of a scene. For the environment to be correctly projected on a dome, we’re stuck using only one lens setting, and we can’t exactly zoom, because that would actually translate into a camera move.

[Diagrams: one-directional z-depth vs. multidirectional z-depth]




Everything is based upon the 3D camera’s proximity and placement within a scene, and its field of view has to remain constant. The filming language we’ve grown to accept without realizing it is subtle and full of nuance. The dome world is still building a shooting vocabulary, let alone a well-developed visual language.
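To make the “one lens setting” point concrete, here’s a minimal sketch of the angular fisheye (domemaster) mapping that a dome render effectively bakes in. The function name and the +z-up convention are my own assumptions for illustration; the takeaway is that the projection has no focal-length parameter at all, so the only way to “zoom” is to physically move the camera.

```python
import math

def domemaster_uv(direction):
    """Map a 3D view direction to (u, v) in a domemaster (angular fisheye) frame.

    Assumes a 180-degree hemispherical dome with +z pointing at the zenith.
    Returns None for directions below the dome's horizon.
    """
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length

    phi = math.acos(z)                  # angle away from the zenith
    if phi > math.pi / 2:
        return None                     # below the horizon, off the dome
    r = phi / (math.pi / 2)             # 0 at zenith, 1 at the dome's edge
    theta = math.atan2(y, x)            # azimuth around the dome

    # Note: no focal length anywhere -- the dome's geometry fixes the FOV.
    return (0.5 + 0.5 * r * math.cos(theta),
            0.5 + 0.5 * r * math.sin(theta))

print(domemaster_uv((0, 0, 1)))  # zenith -> frame center (0.5, 0.5)
print(domemaster_uv((1, 0, 0)))  # horizon, dead ahead -> frame edge (1.0, 0.5)
```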

Affordable Black hole

September 1, 2009

In production we know that particles are just darn expensive. They require a lot of meticulous editing and a massive amount of time to render.

We had the challenge of visualizing a black hole, but had to do it in about a two-week period. So of course, doing a scientifically accurate simulation, using particles and immensely complex equations to describe the physics of a theoretical object, was a little out of the question for a kids’ show. Instead, I used animated textures and alphas on solid geometry to create an artistic representation.

[Images: black hole grey-shade, wireframe, and final textured renders]
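As a loose illustration of the idea (not our actual setup, and every name here is made up): rather than simulating particles, you can precompute a short looping sequence of swirl frames and map them, alpha and all, onto simple disk geometry.

```python
import numpy as np

SIZE, FRAMES = 256, 48

# Polar coordinates over a square frame.
yy, xx = np.mgrid[-1:1:SIZE * 1j, -1:1:SIZE * 1j]
radius = np.hypot(xx, yy)
angle = np.arctan2(yy, xx)

# A fixed 1D noise lookup; streaks come from indexing it by angle.
rng = np.random.default_rng(0)
noise = rng.random(360)

frames = []
for f in range(FRAMES):
    # Rotate the lookup each frame (loops after FRAMES frames); the
    # radius-dependent twist bends the streaks into spiral arms.
    spin = angle + 2 * np.pi * f / FRAMES + 3.0 * radius
    streaks = noise[np.degrees(spin).astype(int) % 360]
    alpha = np.clip(1.5 - 2.0 * radius, 0.0, 1.0)  # fade out toward the rim
    frames.append((streaks * alpha * 255).astype(np.uint8))

# Each frame (with its alpha) would then be mapped onto a flat disk in the scene.
```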

Working with our content expert, we reached a comfortable compromise, and the final product is both beautiful and terrifying as a result.

Morehead planetarium black hole test from Peter Althoff on Vimeo.

Now this is an example of keeping the target audience in mind. We know that this show is intended for children and families, so that gives us some flexibility. Generally, the public isn’t going to be all that concerned about, or more importantly notice, a difference between something artistically visualized and something accurately simulated. If we were trying to generate something for scientific minds to analyze, we might not have gone this route.



Final Cut now supports 4K and RED camera natively

August 28, 2009

Looks like the new version of Final Cut supports 4K resolution and RED camera natively.

http://www.apple.com/finalcutstudio/finalcutpro/digital-cinema-workflows.html

If we weren’t already using After Effects for our final edit, I’d move us over to Final Cut, since we’re doing our sound design with Logic and Soundtrack. But we may soon have access to a RED camera, so it’ll be nice to pull footage into Final Cut for editing. I’d love to hear from people who have used Final Cut for their 4K footage. What do you think?

Eureka! How to stitch alphas

August 19, 2009


We’ve been running into some issues stitching together frames that have varying opacity, namely clouds and particle systems. Originally, when using a sequence of PNGs, we’d find ourselves with a seam around the stitched border. This was due to the alphas being added together at the seam line, creating a 1-2 pixel border with a combined opacity greater than that of the pixels around it.

[Image: seam artifacts along the stitch border]
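Toy numbers (my own, purely for illustration) make the failure obvious: the anti-aliased pixels right on the stitch line carry partial alpha in both slices, so any stitcher that accumulates coverage doubles them up relative to their neighbors.

```python
# One pixel on the stitch line, rendered in both slices of a 50%-opaque cloud.
neighbor_alpha = 0.5               # pixels on either side of the seam
edge_left, edge_right = 0.5, 0.5   # the shared boundary pixel, once per slice

summed = edge_left + edge_right                  # 1.0 -- a fully opaque seam
over = edge_left + edge_right * (1 - edge_left)  # 0.75 -- still denser than 0.5

# Either way, the seam pixel ends up more opaque than its neighbors,
# which is exactly the 1-2 px line we were seeing.
print(neighbor_alpha, summed, over)
```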

I realized the problem came from the stitching software not understanding the alpha channel, and that if I controlled the alpha myself, rather than leaving it to the code, I could remove that variable from the equation. So by outputting an opaque color pass and an opaque alpha pass, I could use one to cut out the other as a luma matte in After Effects.

[Images: opaque color pass, opaque alpha pass, and the luma matte setting in After Effects]

This removes the seam issues and leaves an alpha channel that can be manipulated independently.

[Image: the seamless stitched result]

True, this creates more files, but it doesn’t really increase render time, as the alpha information is calculated in a render anyway and either mixed into a 32-bit frame or simply discarded in a 24-bit frame. And if you select Alpha Split in the .tga file setup when outputting, rather than discarding that information it will be saved as “A_[filename].tga”, giving you the two opaque frames you need for stitching.

[Image: the Alpha Split option in the .tga output setup]
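For anyone who’d rather script the recombination than do it in After Effects, here’s a rough sketch of the same luma-matte cut (the library and filenames are stand-ins; only the “A_[filename].tga” naming comes from the Alpha Split option above):

```python
import numpy as np
import imageio.v3 as iio  # any image I/O library will do

# Alpha Split writes two opaque frames per render:
# the color pass and its matching "A_" alpha pass.
color = iio.imread("frame_0001.tga").astype(np.float32) / 255.0
alpha = iio.imread("A_frame_0001.tga").astype(np.float32) / 255.0

# Stitch the opaque passes first (nothing transparent for the stitcher to
# mangle), then cut the color with the alpha pass as a luma matte.
if alpha.ndim == 3:
    luma = alpha[..., :3].mean(axis=-1)  # collapse an RGB matte to luminance
else:
    luma = alpha                         # already a single grayscale channel

rgba = np.dstack([color[..., :3], luma])  # reattach as a real alpha channel
iio.imwrite("recombined_0001.png", (rgba * 255).astype(np.uint8))
```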


Hope this is helpful. I know for us this was a great discovery, and kind of a “why didn’t I think of that before?” moment. I also realize that stitching isn’t the best solution, but it is sometimes necessary.

Classic Film Techniques

August 13, 2009

In our conversion of The Magic Tree House, there is a sequence of shots whose visuals are being redone. One part of that sequence has us on the surface of Mars following the Sojourner rover, and there we ran into a hitch. We wanted to achieve two goals for this section: have the rover exit the lander, and end with an impression of the rover exploring the surface of Mars. Since the audio commentary was to remain unchanged, we were fairly constrained in the options we had to visually tell the story. To keep the number of shots to an absolute minimum, so we could fit the section into the already predetermined sequence length, we had to turn to some film techniques we weren’t sure would translate to a dome.

Needing to show a passage of time so that the following shot of the rover driving off into the Martian sunset would make sense, we lowered the sun over a series of dissolves while keeping the same camera dolly-in. The reason we felt it would translate well to the dome is that the continued forward motion keeps the parallax of the rocks and boulders selling distance, while the growing shadows, combined with the sky’s shift in hue and saturation, help create some real immersion. Check out the video below:

[Video: shot 04 – the Mars sunset dissolve sequence]


The Rush to Increase Resolution

August 5, 2009

I was reading the fulldome Yahoo listserv today (the “My farm’s bigger than yours” thread) and saw that a couple of people mentioned producing for 8K systems. Wow. Already? Hmmmm. I’m wondering if we’re jumping the gun a bit.

Now, for a minute, forget about the technical issues, like the fact that After Effects can’t easily handle anything larger than 4K, and that we’d need a render farm four times bigger than our current one to handle the processing (an 8K frame holds four times the pixels of a 4K frame). After all, we’ve got Moore’s law working for us, and sooner rather than later the hardware and software will catch up.

What I’m wondering is: will the average Joe Planetarium visitor appreciate the difference? After all, 4K looks great, and I even think 2K looks pretty damn good on a large dome. And being part of the industry, I’m probably much more discriminating than 99% of the general public out there. I haven’t yet seen any 8K demos or been to any of the installations Sky-Skan has done in China, but I’ve been assured by Steve Savage over at Sky-Skan that it looks phenomenal and that even 4K content looks better on an 8K system (which I don’t really understand). And yes, it is supposed to rival the image quality of large-format 70mm film. So OK, maybe it’ll look fantastic and we’ll sit back and marvel at our own magnificence.

However, think about this: in that same thread on the fulldome listserv, Paul Mowbray over at NSC Creative mentioned that their “Centrifuge” scene in Astronaut was “rendered at 2400×2400 and then scaled up to 3600×3600,” and it still looked amazing on a large dome with a 4K system. In fact, it looked good enough that it picked up a Domie at the 2008 DomeFest.

He also said this: “… don’t get caught up with pure resolution … 4k doesn’t = high quality. If you have a big enough render farm/budget/time/patience then the higher res the better but at the moment very few domes can even show 4k so by the time they can you’ll probably be making a new show so in the meantime focus on the content itself.”

If we spent as much time worrying about storytelling and compelling content as we do about resolution, we’d have a lot more people excited about going to their nearest dome.

Centrifuge – ASTRONAUT – Fulldome from NSC Creative on Vimeo.

Live Action for dome’s sake.

August 3, 2009

I’m going to discuss some potential issues I’ve been mulling over about blending live action and CG on a dome. The following links explain some of the terms I’ll be using in more detail:
Chroma Keys (a.k.a. Green Screen)
Match Moving

Generating live-action footage for a dome has been an ongoing challenge for anyone producing content larger than 2K. The resolution of most current HD cameras only lets us cover the bottom half of a 4K fisheye master. This means, of course, that part, if not all, of the environment the live actors interact with will need to be computer generated. And when shooting live action, you’re somewhat limited in how much motion you can incorporate into a shot.

The challenge of shooting a moving camera shot is matching that motion in the digital 3D world. You need to record the camera’s position and orientation through each camera move and replicate them, so that your filmed and keyed-out actors stay rooted to the scene. You could achieve this with a motion control rig for the camera: the camera’s move is programmed, so every take is identical and human error is removed from the situation. The downside is that the cost of renting and operating such equipment can be excessive.

Another option is to sync the camera up using match move software and tracking markers. Most of that software, though, has been developed to track xyz positions in relation to a single plane of footage, and has yet to be calibrated for the unique distortion of a fisheye lens. A workaround would be to lock down the camera during filming and then move the actor’s image in 3D, but that would be limited in its ability to recreate complex camera moves.

Hopefully, as fulldome video becomes more mainstream, camera companies will develop hardware that makes live action a more plausible option for smaller studios. The benefits of using real actors and building on existing sets lead to a more believable experience for audiences. It also makes production a little simpler, because practical solutions can be found on set rather than everything having to be created in post.

A New Jack for Magic Tree House

July 29, 2009

Last week we spent about 4 hours in the trenches at Trailblazers Studios in Raleigh re-recording the voice of Jack for our latest production, Magic Tree House: Space Mission. Will Osborne, who wrote the script, worked with 11-year-old Blake Pierce to bring out a newer, slightly older-sounding Jack who will more accurately portray what Jack would really sound like. It took Blake a few minutes to get warmed up and comfortable (it was his first official acting job, after all), but with Will’s help and great demeanor, Blake morphed into Jack before our eyes and busted out his lines like a pro.

[Image: the recording session]