
Final Cut now supports 4K and RED cameras natively

August 28, 2009

Looks like the new version of Final Cut supports 4K resolution and RED cameras natively.

http://www.apple.com/finalcutstudio/finalcutpro/digital-cinema-workflows.html

If we weren’t already using After Effects for our final edit, I’d move us over to Final Cut, since we’re doing our sound design with Logic and Soundtrack. But we may soon have access to a RED camera, so it’ll be nice to pull footage into Final Cut for editing. I’d love to hear from people who have used Final Cut for their 4K footage. What do you think?

Eureka! How to stitch alphas

August 19, 2009


We’ve been running into some issues stitching together frames that have varying opacity, namely clouds and particle systems. Originally, when using a sequence of PNGs, we’d find a seam around the stitched border. This was due to the alphas being added together at the seam line, creating a 1-2 pixel border with a combined opacity greater than the pixels around it.

[Image: badseams2]
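To see why that border shows up, here’s a minimal sketch of the math at a single hypothetical seam pixel, assuming the stitcher simply composites the two tiles’ feathered edges over each other:

```python
# One seam pixel where both tiles render the same semi-transparent cloud.
a_cloud = 0.5                # the opacity this pixel *should* end up with
a_left = a_right = a_cloud   # each tile's feathered edge carries it too

# "Over" compositing: a_out = a_top + a_bottom * (1 - a_top)
a_seam = a_left + a_right * (1 - a_left)
print(a_seam)  # 0.75, denser than the 0.5 around it: the 1-2 pixel border
```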

I realized the problem came from the stitching software not understanding the alpha channel, and that if I controlled the alpha myself, rather than leaving it to the code, I could remove this variable from the equation. So by outputting an opaque color pass and an opaque alpha pass, I could use one to cut out the other as a luma matte in After Effects.

[Image: opaque_color1]

[Image: opaque_alpha1]

[Image: aftereffects_menu1]

Thus removing the seam issues, and leaving us with an alpha channel that can be manipulated independently.

[Image: noseams1]
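If it helps to see the cut-out step as code rather than menus, here’s a rough NumPy sketch of what the luma matte is doing. The function name is just for illustration, and it assumes both passes are already loaded as float RGB arrays in [0, 1]:

```python
import numpy as np

def cut_out_with_luma_matte(color_pass, alpha_pass):
    """Use the opaque alpha pass's luminance as the color pass's alpha."""
    # Rec. 601 luma of the alpha pass becomes the new alpha channel.
    luma = (alpha_pass * np.array([0.299, 0.587, 0.114])).sum(axis=-1)
    # Straight (unpremultiplied) RGBA: nothing gets summed at any seam.
    return np.dstack([color_pass, luma])

# Stand-in frames; in practice these come from the two rendered passes.
color_pass = np.full((4, 4, 3), 0.8)
alpha_pass = np.full((4, 4, 3), 0.5)
rgba = cut_out_with_luma_matte(color_pass, alpha_pass)  # shape (4, 4, 4)
```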

True, this creates more files, but it doesn’t really increase render time, as the alpha information is calculated in a render anyway and either mixed into a 32-bit frame or simply discarded in a 24-bit frame. And if you select Alpha Split in the .tga file setup when outputting, rather than discarding that information it will be saved as “A_[filename].tga”, giving you the two opaque frames you need for stitching.

[Image: alpha_splitsetup]

Hope this is helpful. I know for us this was a great discovery, and kind of a “why didn’t I think of that before?” moment. I also realize that stitching isn’t the best solution, but sometimes it’s necessary.

Classic Film Techniques

August 13, 2009

In our conversion of The Magic Tree House, there is a sequence of shots whose visuals are being redone. One part of that sequence puts us on the surface of Mars following the Sojourner rover, but we ran into a hitch. There were two goals we wanted to achieve for this section: to have the rover exit the lander, and to end with an impression of the rover exploring the surface of Mars. Since the audio commentary was to remain unchanged, we were fairly constrained in our options for telling the story visually. To keep the number of shots to an absolute minimum, so we could fit them into the already predetermined sequence length, we had to look to some film techniques we weren’t sure would translate to a dome.

Needing to show a passage of time so the following shot of the rover driving off into the Martian sunset would make sense, we lowered the sun over a series of dissolves while keeping the same camera dolly-in. The reason we felt it would translate well to the dome is that the continued forward motion keeps the parallax of the rocks and boulders to show distance, and the growing length of the shadows, combined with the change in the sky’s hue and saturation, really helps create some immersion. Check out the video below:

[Video: shot04]


The Rush to Increase Resolution

August 5, 2009

I was reading the fulldome Yahoo listserv today (the “My farm’s bigger than yours” thread) and saw that a couple of people mentioned producing for 8K systems. Wow. Already? Hmmmmm. I’m wondering if we’re jumping the gun a bit.

Now, for a minute, forget about the technical issues, like the fact that After Effects can’t easily handle anything larger than 4K, and that we’d need a render farm 4x bigger than our current one to handle the processing (an 8K frame has four times the pixels of a 4K frame). After all, we’ve got Moore’s law working for us, and sooner or later the hardware and software will catch up.

What I’m wondering is: will the average Joe Planetarium visitor appreciate the difference? After all, 4K looks great, and I even think 2K looks pretty damn good on a large dome. And being part of the industry, I’m probably much more discriminating than 99% of the general public out there. I haven’t yet seen any 8K demos or been to any of the installations that Sky-Skan has done in China, but I’ve been assured by Steve Savage over at Sky-Skan that it looks phenomenal, and that even 4K content looks better on an 8K system (which I don’t really understand). And yes, it is supposed to rival the image quality of large-format 70mm film. So OK, maybe it’ll look fantastic and we’ll sit back and marvel at our own magnificence.

However, think about this: in that same thread on the fulldome listserv, Paul Mowbray over at NSC Creative mentioned that their “Centrifuge” scene in Astronaut was “rendered at 2400×2400 and then scaled up to 3600×3600”, and it still looked amazing on a large dome with a 4K system. In fact, it looked good enough that it picked up a Domie at the 2008 DomeFest.

He also said this: “… don’t get caught up with pure resolution … 4k doesn’t = high quality. If you have a big enough render farm/budget/time/patience then the higher res the better but at the moment very few domes can even show 4k so by the time they can you’ll probably be making a new show so in the meantime focus on the content itself.”

If we spent as much time worrying about storytelling and compelling content as we do about resolution, we’d have a lot more people excited about going to their nearest dome.

[Video: Centrifuge – ASTRONAUT – Fulldome from NSC Creative on Vimeo]

Live Action for dome’s sake.

August 3, 2009

I’m going to discuss some potential issues I’ve been mulling over about blending live action and CG on a dome. The following links discuss some of the terms I’ll be using in further detail:
Chroma Keys (Aka, Green Screen)
Match Moving

Generating live action footage for a dome has been an ongoing challenge for anyone producing content larger than 2K. The current resolution of most HD cameras (1920×1080) only allows us to create the bottom half of a 4K fisheye master. This means, of course, that part, if not all, of the environment the live actors interact with will need to be computer generated. Also, when shooting live action, you’re somewhat limited in how much motion you can incorporate into a shot.

The challenge of shooting a moving camera shot is needing to match that motion in the digital 3D world. You’ll need to be able to record the camera’s position and orientation for each camera move, and replicate it so that your filmed and separated actors stay rooted to the scene. You could achieve this using a motion control rig that the camera sits on: with every take you can program the camera’s move, so human error is removed from the situation. The downside is that the cost of renting and operating such equipment can be excessive.
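On the CG side, replaying a recorded move is the easy part. Here’s a purely illustrative sketch, assuming Maya and a hypothetical per-frame export from the rig (the values below are placeholders):

```python
import maya.cmds as cmds

# Hypothetical recorded move: one (tx, ty, tz, rx, ry, rz) tuple per frame,
# exported from the motion control rig. These values are placeholders.
moves = [
    (0.0, 150.0, 400.0, 0.0, 0.0, 0.0),
    (0.0, 150.0, 395.0, 0.0, 0.5, 0.0),
    (0.0, 150.0, 390.0, 0.0, 1.0, 0.0),
]

cam = cmds.camera(name="liveActionCam")[0]  # transform node of a new camera
for frame, (tx, ty, tz, rx, ry, rz) in enumerate(moves, start=1):
    cmds.currentTime(frame)
    cmds.xform(cam, translation=(tx, ty, tz), rotation=(rx, ry, rz))
    cmds.setKeyframe(cam, attribute=["translate", "rotate"])
```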

Another option is to try syncing the camera up using match move software and tracking markers. However, most of that software has been developed to track XYZ positions in relation to a single flat plane of footage, and has yet to be calibrated to work with the unique distortion of a fisheye lens. A workaround would be to lock down the camera during filming and then move the actor’s image in 3D, but that would be limited in its ability to recreate complex camera moves.
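To make the distortion problem concrete, here’s a quick comparison of where a standard rectilinear (pinhole) model and an equidistant fisheye model (one common fisheye projection, assumed here for illustration) place the same feature in the frame. The gap grows fast away from the center, which is exactly where a plane-based tracker’s solve falls apart:

```python
import math

# r = distance of a feature from the image center, f = focal length.
# Rectilinear (what most trackers assume): r = f * tan(theta)
# Equidistant fisheye:                     r = f * theta
f = 1.0
for deg in (10, 30, 60, 80):
    theta = math.radians(deg)
    print(f"{deg:2d} deg off-axis: rectilinear r = {f * math.tan(theta):.3f}, "
          f"fisheye r = {f * theta:.3f}")
```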

Hopefully, as fulldome video becomes more mainstream, camera companies will develop the hardware that makes live action a more plausible solution for smaller studios. The benefits of using real actors and building on existing sets lead to a more believable experience for audiences. It also makes production a little simpler, because practical solutions can be used rather than everything leaning on being created in post.

A New Jack for Magic Tree House

July 29, 2009

[Image: mth_script]

Last week we spent about 4 hours in the trenches at Trailblazers Studios in Raleigh rerecording the voice of Jack for our latest production – Magic Tree House: Space Mission. Will Osborne, who wrote the script, worked with 11-year-old Blake Pierce to bring out a newer, slightly older-sounding Jack that will more accurately portray what Jack would really sound like. It took Blake a few minutes to get warmed up and comfortable (it was his first official acting job, after all), but with Will’s help and great demeanor, Blake morphed into Jack before our eyes and busted out his lines like a pro.

[Image: record]

My God, it’s full of stars!

July 27, 2009

The camera rig we used for Earth, Moon, and Sun has undergone a slight change for Magic Tree House. We use a 5-camera setup, with Sky-Skan’s proprietary DomeXF plugin for After Effects to stitch all the cameras together. As you can see in the picture, we have all the cameras pointed in the correct directions with the appropriate angles of view. We also gave the appropriate cameras planes shaded with Use Background so we don’t have to render stuff we won’t see. Sometimes, however, we need to blur certain elements, in which case we turn those planes off so there won’t be feathering on the master frames. In order to move these cameras where we need them, without letting them drift independently, a supermover holds the group node of all 5 cameras. A NURBS arrow shows which direction is front.

[Image: 5camsetup]
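For the curious, here’s a minimal maya.cmds sketch of the idea: five cameras parented under one mover. The names, rotations, and the 90° angle of view below are illustrative placeholders, not our actual rig values:

```python
import maya.cmds as cmds

# Five cameras covering the dome; rotations and AOV here are placeholders.
directions = {
    "front": (0, 0, 0),
    "left":  (0, 90, 0),
    "right": (0, -90, 0),
    "back":  (0, 180, 0),
    "top":   (-90, 0, 0),
}

cams = []
for label, rot in directions.items():
    cam = cmds.camera(name="cam_" + label, horizontalFieldOfView=90)[0]
    cmds.xform(cam, rotation=rot)
    cams.append(cam)

# One "supermover" moves and rotates all five cameras as a unit,
# so no camera can ever drift independently of the others.
supermover = cmds.group(cams, name="supermover")
```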

Now, this camera setup is what we used for Earth, Moon, and Sun. With EMS, we’d have our starfield referenced in separately, but this time, for MTH, we decided to just combine the two and make it easier on our end. One of the first things I learned about astronomy after starting work at Morehead is that when you move from one planet to another in our solar system, there is hardly any star movement, if any at all. Essentially the stars are locked in space, and only appear to move when we rotate the camera. Since we didn’t want our starfield to shift in space, we applied a point constraint to keep the stars in the same position relative to the cameras, while still allowing us to rotate the stars to accurately reflect where we are.

[Image: 5camsetupstars]
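In maya.cmds terms, that star trick is essentially a one-liner (node names here are hypothetical):

```python
import maya.cmds as cmds

# Position-only constraint: the starfield follows the camera rig around,
# but its rotation channels stay free, so the sky can still be oriented
# to match wherever we are in the solar system.
cmds.pointConstraint("supermover", "starfield_grp", maintainOffset=False)
```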

Quick Cuts on the Dome?

July 22, 2009

[Image: red_ceremonial_scissors]

Quick cuts on the dome? Within a scene? MTV style!? Heeeeeellll no. Are you crazy?

Right?

Well, that seems to be the current golden rule of dome production. Quick cuts or moving from a wide to medium to close shot would kill the immersiveness of the dome environment. It would also be too jarring for the viewer. So everything lumbers along slowly and epic-ly. Don’t get me wrong, I like the epic reveal of the sun cresting above the earth as much as the next person. We’re actually doing a couple of those shots in our current production.

But are we locked into this medium-shot, slow camera pan or push with all of our scenes? It’s visually tedious. Coming from the flat screen world, we want to cut. Cutting allows the viewer of a flat screen to see the entire environment – something that’s not necessary with a dome. But it also creates tension, builds emotion, and gives some much-needed visual variety.

Has anyone experimented with this? Are there any good examples out there of why it absolutely doesn’t work?

Lunar City

July 21, 2009

We’re reworking a shot from an old show we’ve been commissioned to convert to the fulldome platform, showing what a colony on the moon might look like. Rather than go with something that’s the equivalent of the Mir space station on the moon, I thought something much more fantastic, like a full city, might be more inspiring to the younger audiences who see the show. It’s still a work in progress, but it’s come a long way.

The city is equipped with its own fleet of touring taxis.