Over the past few weeks, I’ve been fortunate enough to work on the lighting and compositing team for the upcoming animated short, Cauldron Bubbles, a production by Edward Taylor. It’s an exciting introduction to development pipelines and I’m learning a lot. For one, I’m learning to streamline my efforts and reduce the turnaround time for each assignment. We’ve got two excellent supervisors from Reel FX, Nick Shirsty and Ed Whetstone, offering feedback and teaching us the workflow between Autodesk Maya and The Foundry’s Nuke.
I’ve had quite a lot of instruction in Photoshop in my recent education, mostly as a photo editing and compositing solution. I figured the concepts would be similar enough that I could transition to another compositing program with minimal hiccups. Though similar in purpose, the two programs, Photoshop and Nuke, couldn’t work more differently. Photoshop has a linear, one-dimensional system of layers that is simple yet powerful. Each layer can be masked easily, allowing you to “paint” in the information you want where you want it. The software is mostly bug-free, and any errors you face will likely be due to an overlooked setting rather than an actual flaw in the code.
Nuke is in a league of its own. The software offers a two-dimensional approach to layering, allowing you to literally send information through “pipes”, arrows indicating the direction and destination of the given content. Instead of layers, you have nodes with pipes leading to and from them, creating a complex network of algorithmic activity. Throw in an OpenEXR with a few dozen mattes and passes, and compositing an animated sequence becomes infinitely more workable. When Nuke works, it’s an absolute joy to use, but it’s a pretty cranky beast in some cases. Things you’d assume would work (because they did in Photoshop) won’t necessarily work the same way in Nuke.
For instance, adding edge blur in Photoshop consisted of me slapping a mask on a blurred layer and painting a radial gradient into that mask with the Brush tool. The white parts show through, the black parts are invisible, resulting in blur around the edges and corners. Simple. Now, Nuke doesn’t have a brush tool that I’m aware of, but it does have a node called Radial. This node creates a white radial gradient, much like an inverted version of the gradient I’d paint in Photoshop. However, piping this node in as a mask doesn’t have the same effect that painting it into a Photoshop mask would have. I may be wrong, but as I understand it, a mask is essentially an alpha channel. When a black-and-white image is used as a mask, the white indicates what is visible and vice versa. Plugging the black-and-white Radial node into the mask input of another node should result in the white dictating where the connected node’s output shows up, right? But it doesn’t.
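For what it’s worth, the math I *expected* a mask to perform is just a per-pixel linear blend: white (1.0) shows the masked input, black (0.0) hides it. Here’s a minimal Python sketch of that idea, with a hypothetical `radial_mask` standing in for the Radial node and plain nested lists standing in for images; this illustrates the concept, not Nuke’s actual internals.

```python
def radial_mask(w, h):
    """Radial gradient: 1.0 at the center, falling to 0.0 at the corners."""
    cx, cy = (w - 1) / 2, (h - 1) / 2
    max_d = (cx ** 2 + cy ** 2) ** 0.5
    return [[1.0 - (((x - cx) ** 2 + (y - cy) ** 2) ** 0.5) / max_d
             for x in range(w)] for y in range(h)]

def composite(fg, bg, mask):
    """Per-pixel linear blend: out = fg*mask + bg*(1 - mask)."""
    return [[f * m + b * (1.0 - m)
             for f, b, m in zip(frow, brow, mrow)]
            for frow, brow, mrow in zip(fg, bg, mask)]

# Toy 3x3 example: a "sharp" image (all 1.0) masked over a "blurred" one (all 0.0)
sharp = [[1.0] * 3 for _ in range(3)]
blurred = [[0.0] * 3 for _ in range(3)]
mask = radial_mask(3, 3)
out = composite(sharp, blurred, mask)
print(out[1][1])  # center of the mask is white -> sharp image shows: 1.0
print(out[0][0])  # corner of the mask is black -> blurred image shows: 0.0
```

That blend is exactly what a Photoshop layer mask does; the surprise in Nuke is that a node’s mask input often limits *that node’s effect* rather than performing this kind of over-style blend.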
One thing’s for sure: stubborn as Nuke is, I need to familiarize myself with her rules. She’s too powerful a program not to.
Well, that was a right and proper rant. Until next time, folks.