
Adventures in Z-Space

September 28, 2009

When filming in a traditional flat-screen medium, one may use a variety of lenses to create certain dramatic effects. These lenses can be combined with zooms and dolly moves to create the oh-so-dramatic Zolly, where the character's world shifts around them. This video I found explains it pretty well, even though it's a bit cheesy in style.

What's covered are some classic filming techniques, but how can we translate them to the dome?

Unlike a window, where you really only have one direction of z-space to sell, a dome is 360 degrees of z-space. The viewer is fixed in the middle of a scene. In order for the environment to be correctly projected on a dome, we're stuck using only one lens setting, and we can't exactly zoom, because that would actually translate into a camera move.
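To make that zoom-equals-move point concrete, here's the flat-screen relationship behind the Zolly: a subject's on-screen size depends on both its distance and the lens field of view, so any zoom can be traded for a dolly. A minimal sketch in Python (the helper name and numbers are mine, for illustration):

```python
import math

def dolly_distance(d1, fov1_deg, fov2_deg):
    """Camera distance that keeps a subject the same apparent size on a
    flat screen when the lens FOV changes from fov1 to fov2 -- the
    dolly-zoom (Zolly) relationship."""
    t1 = math.tan(math.radians(fov1_deg) / 2)
    t2 = math.tan(math.radians(fov2_deg) / 2)
    return d1 * t1 / t2

# Narrowing a 90-degree lens to 45 degrees while keeping the subject the
# same apparent size means physically pulling the camera back:
print(dolly_distance(2.0, 90, 45))  # ~4.83 -- well over double the distance
```

On the dome, where the lens is effectively fixed, that trade only goes one way: the dramatic effect has to come from actually moving the camera through the scene.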

[Figure: multidirectional z-depth on the dome]

Everything is based upon the 3D camera's proximity and placement within a scene, and its field of view has to remain constant. The filming language we've grown to accept without realizing it is subtle and full of nuance. The dome world is still building a shooting vocabulary, let alone a well-developed visual language.

Putting the Typo in Typography

September 21, 2009

Text treatment can be pretty specific for the flat screen, and even more so when working in dome space. It's always good practice to steer clear of serifs, as they can be hard to read on the flat screen. This is kept in mind a lot in television and film, and should also be something to think about when projecting on the dome surface. We're currently working on the creation of the credits for our conversion show The Magic Tree House, and learned a couple of things from the experience.

The distance of the credits or text from the camera can play a big factor in legibility: the closer to the camera, the more distortion we see. A good rule of thumb we've found is that if you keep the size of the text no larger than one camera's coverage (a 90-degree section of the 360-degree dome), there isn't much distortion. Another big thing to remember is that the resolution of the dome can vary significantly from planetarium to planetarium. Although the text might look really nice and crisp in the 4K version, those planetaria that have 1K domes may not be able to make out the text very easily.
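Here's a rough way to sanity-check legibility across different domes, assuming an equidistant fisheye master whose full width spans 180 degrees of dome (my simplification; the exact mapping varies by projection):

```python
def text_pixels(angular_height_deg, master_res):
    """Approximate pixel height of text on a square fisheye dome master,
    assuming the master's full width covers 180 degrees of dome."""
    pixels_per_degree = master_res / 180.0
    return angular_height_deg * pixels_per_degree

# Text subtending 5 degrees of the dome:
print(text_pixels(5, 4096))  # ~114 px on a 4K master -- crisp
print(text_pixels(5, 1024))  # ~28 px on a 1K dome -- borderline legible
```

The same text that looks sharp at 4K drops below 30 pixels of height on a 1K dome, which is roughly where legibility starts to fall apart.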






FullDome to the masses

September 14, 2009

Oftentimes I find myself being asked, "What do you do for a living?" and it's never a short answer.

The response is usually, "I design planetarium shows." The conversation never stops there. It's usually followed by remarks of wonder and enthusiasm, but never a sense of comprehension. So of course an explanation is needed to fill out exactly what kind of planetarium shows it is I design.

Explaining that planetariums are no longer just planetariums is my first step, and introducing the new vocabulary of Full Dome Video is what follows. I usually explain that if they've seen an IMAX show, they should imagine being inside the screen rather than looking at it.

We turn the entire surface of the dome into one large screen that uses modern animation techniques similar to those of the motion picture industry, which of course makes for even more enthusiasm and excitement, and a little more sense of what it is I do.

This concept that planetariums are no longer just grounded in space science has been something that most science centers and planetariums have had trouble explaining to the public. We can do anything now, and while this new spectrum of opportunities is wonderful, it's equally troubling, because the public's expectations haven't caught up yet. They come to a planetarium to sit in the dark and see stars. So of course "branding" has become a central focus for newly converted domes. Terms like SciDome, DigiDome, and Dome Theater are being used to get people to understand it's not just a planetarium anymore, but instead a Full Dome experience. We here at Morehead are going through the same growing pains, and are currently in the process of discovering what our new theater will be called once it's upgraded.

As the medium continues to gain ground and become more widely recognized, this will of course become a problem of the past. I'm excited to think that one day people will happily be able to go down to the science center not knowing what to expect, rather than expecting something they've seen before. That having a show about biology or zoology will be just as accepted as seeing a show on the constellations or our solar system. Who knows, maybe one day we'll even go to the planetarium looking forward to catching that new Hollywood blockbuster released on limited dome screens.

Maybe that last part is just a nerdy fantasy of being able to see Batman vs Superman on a dome, but a boy can dream.

Affordable Black hole

September 1, 2009

In production we know that particles are just darn expensive. They require a lot of meticulous editing and a massive amount of time to render.

We had the challenge of visualizing a black hole, but had to do it in about a two-week period. So of course doing a scientifically accurate simulation, using particles and immensely complex equations to describe the physics of a theoretical object, was a little out of the question for a kids' show. I instead went with animated textures and alphas on solid geometry to create an artistic representation.

[Figures: black hole grey-shaded model, wireframe, and final textured renders]

Working with our content expert, we reached a comfortable compromise, and the final product is as beautiful as it is terrifying.

Morehead planetarium black hole test from Peter Althoff on Vimeo.

Now this is an example of keeping the target audience in mind. We know that this show is intended for children and families, so that gives us some flexibility. Generally the public isn't going to be all that concerned about, or more importantly notice, a difference between something artistically visualized and something accurately simulated. If we were trying to generate something for scientific minds to analyze, we might not have gone this route.



Final Cut now supports 4K and RED cameras natively

August 28, 2009

Looks like the new version of Final Cut supports 4K resolution and RED camera footage natively.

http://www.apple.com/finalcutstudio/finalcutpro/digital-cinema-workflows.html

If we weren’t already using After Effects for our final edit, I’d move us over to Final Cut since we’re doing our sound design with Logic and Soundtrack. But we may soon have access to a RED camera so it’ll be nice to pull in footage to Final Cut for editing. I’d love to hear from people who have used Final Cut for their 4K footage. What do you think?

Eureka! How to stitch alphas

August 19, 2009


We've been running into some issues stitching together frames that have varying opacity, namely clouds and particle systems. Originally, when using a sequence of PNGs, we'd find ourselves with a seam around the stitched border. This was due to the alpha being added together at the seam line, creating a 1-2 pixel border with a combined opacity greater than the pixels around it.
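Here's a toy illustration of the mechanism, assuming the stitcher combines the overlapping edge pixels with a standard "over" operation (my reading of what was happening):

```python
# If two stitched frames overlap by a pixel or two, the stitcher
# composites them, and the combined alpha at the shared edge no longer
# matches the surrounding pixels.
a_top, a_bottom = 0.5, 0.5              # each frame's alpha at the shared edge
a_seam = a_top + a_bottom * (1 - a_top)  # standard "over" alpha
print(a_seam)  # 0.75 -- denser than the 0.5 everywhere else, hence the seam
```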

[Figure: seam artifacts along the stitch border]

I realized the problem came from the stitching software not understanding the alpha channel, and that if I controlled that myself, rather than leaving it to the code, I could remove this variable from the equation. So by outputting an opaque color pass and an opaque alpha pass, I could use one to cut out the other as a luma matte in After Effects.
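Outside of After Effects, the same recombination is easy to script. A sketch with NumPy and imageio (the filenames are hypothetical, and any image I/O library would do):

```python
import numpy as np
import imageio.v3 as iio

# Hypothetical filenames for one stitched frame pair.
color = iio.imread("opaque_color.0001.png")  # RGB render, fully opaque
alpha = iio.imread("opaque_alpha.0001.png")  # greyscale coverage pass

# Use the alpha pass as a luma matte: its brightness becomes the
# transparency of the color pass, like a luma track matte in After Effects.
if alpha.ndim == 3:
    alpha = alpha[..., :3].mean(axis=-1)     # collapse RGB to luminance
rgba = np.dstack([color[..., :3], alpha.astype(color.dtype)])
iio.imwrite("stitched_with_alpha.0001.png", rgba)
```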

[Figure: opaque color pass]

[Figure: opaque alpha pass]

[Figure: After Effects luma matte setup]

This removes the seam issues, and leaves an alpha channel that can be independently manipulated.

[Figure: stitched result with no seams]

True, this creates more files, but it really doesn't increase render time, as the alpha information is calculated during a render anyway and either mixed into a 32-bit frame or simply discarded in a 24-bit frame. If you select Alpha Split in the .tga file setup when outputting, rather than discarding that information it will be saved as "A_[filename].tga", giving you the two opaque frames you need for stitching.
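For a long frame sequence, pairing up the split files is easy to script. A sketch, using hypothetical paths and the "A_" prefix the Alpha Split option produces:

```python
import glob
import os

# Walk the rendered color frames; every "name.tga" should have a
# matching "A_name.tga" alongside it. (Paths are hypothetical.)
for color_path in sorted(glob.glob("renders/frame_*.tga")):
    alpha_path = os.path.join(os.path.dirname(color_path),
                              "A_" + os.path.basename(color_path))
    if not os.path.exists(alpha_path):
        print("missing alpha pass for", color_path)
        continue
    # ...stitch each pass separately, then recombine them as a
    # luma matte, as in the sketch above.
```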

[Figure: TGA Alpha Split setup]

Hope this is helpful. I know for us this was a great discovery, and kind of a "why didn't I think of that before?" moment. I also realize that stitching isn't the best solution, but sometimes it's necessary.

Classic Film Techniques

August 13, 2009

In our conversion of The Magic Tree House, there is a sequence of shots whose visuals are being redone. One part of that sequence is when we are on the surface of Mars following the Sojourner rover, and there we ran into a hitch. There were two goals we wanted to achieve for this section: to have the rover exit from the lander, and to end with an impression of the rover exploring the surface of Mars. Since the audio commentary was to remain unchanged, we were fairly constrained in the options we had to visually tell the story. To keep the number of shots to an absolute minimum, so we could fit it in the already predetermined sequence length, we had to look to some film techniques we weren't sure would translate to a dome.

Needing to show the passage of time so the following shot of the rover driving off into the Martian sunset would make sense, we lowered the sun over a series of dissolves while keeping the same camera dolly-in. The reason we felt it would translate well to the dome is that the continued forward motion keeps parallax against the rocks and boulders to show distance, and the growing length of the shadows, combined with the sky's change in hue and saturation, really helps to create some immersion. Check out the video below:

[Video: Mars rover sequence, shot 04]


The Rush to Increase Resolution

August 5, 2009

I was reading the fulldome Yahoo listserv today (the "My farm's bigger than yours" thread) and saw that a couple of people mentioned producing for 8K systems. Wow. Already? Hmmmmm. I'm wondering if we're jumping the gun a bit.

Now, for a minute, forget about the technical issues, like the fact that After Effects can't easily handle anything larger than 4K, and that we'd need a render farm 4x bigger than our current one to handle the processing. After all, we've got Moore's law working for us, and sooner rather than later the hardware and software will catch up.
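For reference, the back-of-the-envelope math behind that 4x figure: a square dome master's pixel count grows with the square of its resolution, so each doubling quadruples what you render, store, and play back.

```python
# Pixel counts for square fulldome masters at common resolutions.
for res in (2048, 4096, 8192):
    px = res * res
    print(f"{res // 1024}K: {px:,} px ({px / 4096**2:.2g}x a 4K frame)")
```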

What I'm wondering is: will the average Joe Planetarium visitor appreciate the difference? After all, 4K looks great, and I even think 2K looks pretty damn good on a large dome. And being part of the industry, I'm probably much more discriminating than 99% of the general public out there. I haven't yet seen any 8K demos or been to any of the installations that Sky-Skan has done in China, but I've been assured by Steve Savage over at Sky-Skan that it looks phenomenal, and that even 4K content looks better on an 8K system (which I don't really understand). And yes, it is supposed to rival the image quality of large-format 70mm film. So OK, maybe it'll look fantastic and we'll sit back and marvel at our own magnificence.

However, think about this: in that same thread on the fulldome listserv, Paul Mowbray over at NSC Creative mentioned that their "Centrifuge" scene in Astronaut was "rendered at 2400×2400 and then scaled up to 3600×3600," and it still looked amazing on a large dome with a 4K system. In fact, it looked good enough that it picked up a Domie at the 2008 DomeFest.

He also said, "… don't get caught up with pure resolution … 4k doesn't = high quality. If you have a big enough render farm/budget/time/patience then the higher res the better but at the moment very few domes can even show 4k so by the time they can you'll probably be making a new show so in the meantime focus on the content itself."

If we spent as much time worrying about storytelling and compelling content as we do about resolution, we’d have a lot more people excited about going to their nearest dome.

Centrifuge – ASTRONAUT – Fulldome from NSC Creative on Vimeo.

Live Action for dome’s sake.

August 3, 2009

I'm going to discuss some potential issues I've been mulling over about blending live action and CG on a dome. The following links discuss in further detail some of the terms I may be using.
Chroma Keys (Aka, Green Screen)
Match Moving

Generating live-action footage for a dome has been an ongoing challenge for anyone producing content larger than 2K. The current resolution standards on most HD cameras only allow us to create the bottom half of a 4K fisheye master. This means, of course, that part, if not all, of the environment that live actors interact with will need to be computer generated. Also, when shooting live action, you're somewhat limited in how much motion you can incorporate into a shot.

The challenge of shooting a moving camera shot is the need to match that motion in the digital 3D world. You'll need to be able to record the camera's position and orientation for each camera move, and replicate it so that your filmed and separated actor or actors stay rooted to the scene. You could achieve this using a motion control rig that the camera sits on: with every take you can program the camera's move, so that human error is removed from the situation. The downside is that the cost of renting and operating such equipment can be excessive.

Another option is to try syncing the camera up using match-move software and tracking markers. Most of that software, though, has been developed to track XYZ positions in relation to a single plane of footage, and has yet to be calibrated for the unique distortion of a fisheye lens. A workaround would be to lock down the camera during filming and then move the actor's image in 3D, but that would be limited in its ability to recreate complex camera moves.

Hopefully, as fulldome video becomes more mainstream, camera companies will develop the hardware that will make live action a more plausible solution for smaller studios. The benefits of using real actors and building on existing sets lead to a more believable experience for audiences. It also makes production a little simpler, because practical solutions can be generated rather than leaning on everything being created in post.