CG

Not too long ago, if a character, vehicle, or environment made the leap from an artist’s imagination to the real world, it was planned, sculpted, painted, posed, lit, and shot on film. Filmmakers have partnered with technologists to augment, and in many instances replace, the traditional art disciplines in the pipeline with Computer Graphics (CG), or Computer-Generated Imagery (CGI). How?

Let’s examine the anatomy of a computer-generated asset.

Modeling

In the modeling phase, a shape, or combination of shapes, is transformed into something recognizable. Instead of molding and carving clay, plastic, or resin, polygons are defined (then refined) in software. Lines and facets are divided and re-divided to form smooth curves and surfaces. Shapes are combined to add mass to an object, or to take it away. With the right time, budget, and skill, the proverbial block of marble can become your own David.
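
If you want to see the “divide and re-divide” idea in action, here’s a tiny Python sketch of corner-cutting subdivision (Chaikin’s algorithm), a simple 2D cousin of the subdivision surfaces modeling packages use. It isn’t tied to any particular software, and the shape and numbers are purely illustrative.

```python
# Illustrative sketch (not any 3D package's API): repeatedly "corner-cut" a
# coarse polygon so its facets converge toward a smooth curve -- the same idea
# subdivision surfaces apply to polygon meshes.

def subdivide(points):
    """One round of Chaikin corner-cutting on a closed 2D polygon."""
    refined = []
    for i, (x0, y0) in enumerate(points):
        x1, y1 = points[(i + 1) % len(points)]
        # Replace each edge with two new points at 1/4 and 3/4 along it.
        refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
        refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
    return refined

shape = [(0, 0), (1, 0), (1, 1), (0, 1)]   # a coarse square
for _ in range(4):                          # each pass doubles the point count
    shape = subdivide(shape)
print(len(shape), "points")                 # 64 points approximating a rounded square
```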

Lighting

You can create light sources that mimic real lights in our physical world. Properties such as intensity, color temperature, and falloff are available and configurable. Lights that serve as key, fill, rim, bounce, and kick can be placed anywhere you want, and you can create as many as your shot calls for. The rendering engine, a specialized software package that generates your image, may provide additional features for shaping and refining the lights in your scene.
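
As a rough illustration of those properties, a virtual light boils down to a small bundle of configurable values. The class and attribute names below are invented for the example, not taken from any real renderer.

```python
# Hypothetical sketch of the light attributes described above.
from dataclasses import dataclass

@dataclass
class Light:
    role: str                     # "key", "fill", "rim", "bounce", "kick"
    intensity: float              # arbitrary units
    color_temperature: float      # Kelvin: ~3200 tungsten, ~5600 daylight
    falloff_exponent: float = 2.0 # 2.0 = physically correct inverse-square falloff

    def illumination_at(self, distance: float) -> float:
        """Light reaching a subject at the given distance."""
        return self.intensity / (distance ** self.falloff_exponent)

key = Light(role="key", intensity=1000.0, color_temperature=5600)
fill = Light(role="fill", intensity=250.0, color_temperature=3200)
print(key.illumination_at(2.0))   # 250.0 -- doubling the distance quarters the light
```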

Cameras

Virtual cameras can be generated and placed anywhere in the scene. Unlike on a physical set, you can create as many as you need, each with different focal lengths or lens emulation options. Need a dolly shot? A crane shot? A handheld? You can animate the motion of any of these cameras, perhaps placing all three in the same scene without worrying about tripping over dolly track or cables.
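
Here’s a hedged sketch of what those camera options amount to, using made-up names rather than any particular application’s API: focal length and sensor size determine field of view, and animating a value over frames gives you the move.

```python
# Illustrative only -- not a real DCC application's camera API.
import math
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    focal_length_mm: float
    sensor_width_mm: float = 36.0   # full-frame lens emulation

    def horizontal_fov_degrees(self) -> float:
        """Field of view from focal length and sensor width."""
        return math.degrees(2 * math.atan(self.sensor_width_mm / (2 * self.focal_length_mm)))

def dolly_positions(start_z, end_z, frames):
    """Linearly interpolate camera Z across a shot -- a basic dolly-in."""
    return [start_z + (end_z - start_z) * f / (frames - 1) for f in range(frames)]

wide = VirtualCamera(focal_length_mm=24)
long_lens = VirtualCamera(focal_length_mm=85)
print(round(wide.horizontal_fov_degrees(), 1))       # ~73.7 degrees
print(round(long_lens.horizontal_fov_degrees(), 1))  # ~23.9 degrees
print(dolly_positions(10.0, 2.0, 5))                 # [10.0, 8.0, 6.0, 4.0, 2.0]
```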

Rigging

If modeling creates the body of an object, rigging creates the skeleton. Rigs change a model’s shape at different points of articulation. This is called deformation. You can specify how a rig deforms a particular area in a model, such as the range of motion in an elbow or knee.
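
A bare-bones sketch of that idea, with hypothetical names rather than a real rigging toolkit, might look like this: a joint clamps rotation to its range of motion and drags the geometry attached to it.

```python
# Hypothetical rigging sketch: an elbow joint that only bends within its
# range of motion, deforming a vertex that follows it.
import math
from dataclasses import dataclass

@dataclass
class Joint:
    pivot: tuple              # (x, y) position of the joint
    min_angle: float = 0.0    # degrees -- an elbow can't hyperextend backward
    max_angle: float = 150.0  # degrees -- nor fold completely flat

    def deform(self, vertex, requested_angle):
        """Rotate a vertex around the pivot, clamped to the allowed range."""
        angle = max(self.min_angle, min(self.max_angle, requested_angle))
        rad = math.radians(angle)
        dx, dy = vertex[0] - self.pivot[0], vertex[1] - self.pivot[1]
        return (self.pivot[0] + dx * math.cos(rad) - dy * math.sin(rad),
                self.pivot[1] + dx * math.sin(rad) + dy * math.cos(rad))

elbow = Joint(pivot=(0.0, 0.0))
wrist = (1.0, 0.0)
print(elbow.deform(wrist, 90))    # a quarter bend: approximately (0.0, 1.0)
print(elbow.deform(wrist, 200))   # clamped to the 150-degree range of motion
```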

Shaders

What color is this thing? Is it shiny, or dull? Transparent? Translucent?

How does it feel? Rough? Smooth? Does it have a repeating pattern? A random one? A bit of both?

Questions like these are answered with shaders. Color, texture, and specular highlights are just a sample of the attributes you can define when designing and applying shaders to a model.
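
In code terms, a shader is essentially a bundle of such attributes. The sketch below uses invented names that merely echo common physically based shading controls; it isn’t drawn from any specific renderer.

```python
# Illustrative shader attributes -- hypothetical names, not a real renderer's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Shader:
    base_color: tuple = (0.8, 0.8, 0.8)   # RGB, 0-1
    roughness: float = 0.5                # 0 = mirror-shiny, 1 = fully dull
    specular: float = 0.5                 # strength of highlights
    transparency: float = 0.0             # 0 = opaque, 1 = fully transparent
    texture_map: Optional[str] = None     # optional image file for patterns

brushed_metal = Shader(base_color=(0.6, 0.6, 0.65), roughness=0.3, specular=1.0)
frosted_glass = Shader(base_color=(1.0, 1.0, 1.0), roughness=0.7, transparency=0.8)
```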

Rendering

Rendering takes all of this data – the modeling, lighting, cameras, rigging, and shaders – and generates an image. It may also generate a matte (alpha channel) for compositing.
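
That matte matters because it’s what lets compositors place the rendered element over live-action footage. Here’s a toy example of the standard “over” operation, using the rendered alpha as the matte; the pixel values are arbitrary.

```python
# Toy sketch of why the alpha channel matters downstream: "over" compositing
# blends the CG element onto a background plate using its matte.
def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite a foreground pixel over a background pixel using its alpha."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha) for f, b in zip(fg_rgb, bg_rgb))

cg_pixel = (0.9, 0.2, 0.1)   # rendered CG element (straight, un-premultiplied)
matte = 0.75                 # alpha rendered alongside the RGB
plate = (0.1, 0.3, 0.6)      # live-action background plate
print(over(cg_pixel, matte, plate))  # (0.7, 0.225, 0.225)
```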

Rendering engines are specialized pieces of software that create these images in their own ways. Some are known for more photorealistic output. Others for style. Others for ease of use. All of them are known for one thing: time consumption. Being aware of your team’s hardware and software capabilities will guide you when scheduling time to create the shots that will eventually replace any storyboards in your timeline.
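
A quick back-of-envelope calculation helps with that scheduling. The numbers below are placeholders; substitute your own shot lengths, per-frame render times, and machine counts.

```python
# Back-of-envelope scheduling sketch: all figures here are made up.
def render_hours(shot_seconds, fps, minutes_per_frame, machines):
    frames = shot_seconds * fps
    return frames * minutes_per_frame / 60.0 / machines

# A 10-second shot at 24 fps, 15 minutes a frame, split across 4 render nodes:
print(render_hours(shot_seconds=10, fps=24, minutes_per_frame=15, machines=4))  # 15.0 hours
```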
