Following on from yesterday’s post, I want to continue the discussion of why NUKE has revolutionised the industry and is an incredible asset to any compositor. The popularity of NUKE is largely down to its advancements in 3D integration, not only within its own user interface (UI), but also in its ability to communicate with other packages such as MARI and AtomKraft. Often punningly referred to as NUKE's 2.5D space, the 3D system has sped up the conventional 2D workflow in areas such as paint and roto, as well as depth compositing. This 2.5D space allows users to build rudimentary geometry, which can then be projected onto, textured, lit, shaded and rendered as complementary 3D assets or as part of a standard 2D workflow.
To better understand the relevance and capabilities of the 3D system, I've been keen to explore how feasible it is to run on my MacBook Air. In an effort to grasp how practical it is to manage a compositing job on a much smaller system, I've decided to follow a small series of Foundry tutorials on the Camera Tracker node, found here.
You can find out how I get on in part three of this blog series, which I’ll be posting tomorrow!