Pitfalls of an offline workflow

The online-offline workflow is robust and efficient, but it has many moving parts. In practice, that means you can fall into a number of technical and procedural traps that will slow your work down considerably.

If you are using an online-offline workflow for your project, here are some pitfalls to avoid.

[Note: many of these examples are illustrated with DaVinci Resolve and Premiere Pro, but the principles should translate to any other NLE.]

Pitfall #1: Not using timecode

Timecode is the glue that holds an online-offline workflow together. Premiere Pro is relatively good at keeping track of clips that don’t have timecode, but Resolve more or less requires it. And while Premiere Pro can assign a kind of “fake” timecode inside a Premiere Pro project, that timecode won’t translate properly into Resolve via an XML.

A typical online-offline workflow involving Premiere Pro and DaVinci Resolve consists of:

  1. Preparing proxies in Resolve
  2. Doing the offline edit inside Premiere, with the timeline referencing offline media
  3. Kicking out an XML from Premiere
  4. Importing a folder of original digital negatives into a Resolve project
  5. Relinking the imported timeline from the XML to the online clips inside the Resolve project

Without timecode, the timeline inside Resolve has pretty much no way of identifying the proper in and out points for each clip. Resolve might be able to place the clips into the sequence at the correct sequence in and out points, but there’s no guarantee that each clip’s own in and out points will be correct; in practice, they’ll be arbitrary.

Not all formats have timecode. The MP4 container, a very common container for H.264, isn’t even capable of carrying timecode. This can be confusing, because many post-production apps will display a count that looks like timecode for these clips, even though it isn’t actually timecode. It’s important to be familiar with which formats can and cannot contain timecode.

Timecode is not just the particular count on your media that shows up in the Source Monitor. Just like your clip has a video stream composed of still images in a particular sequence, and an audio stream composed of audio samples in a particular sequence, any professional format that contains SMPTE-compliant timecode has an ancillary timecode stream that assigns each frame a particular hour, minute, second, and frame, according to the SMPTE 12M-2 standard.
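Because every frame is assigned a unique hour, minute, second, and frame count, the relationship between a frame number and its timecode is simple arithmetic. Here’s a minimal, illustrative sketch in Python (non-drop-frame only; real drop-frame timecode at 29.97 fps requires extra handling):

```python
def frames_to_timecode(frame: int, fps: int = 24) -> str:
    """Convert an absolute frame count to HH:MM:SS:FF (non-drop-frame)."""
    ff = frame % fps
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """Inverse: parse HH:MM:SS:FF back into an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff
```

This one-to-one mapping is exactly why relinking works: as long as the proxy and the camera original carry the same timecode stream, the same HH:MM:SS:FF address identifies the same frame in both files.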

ProRes files wrapped in their native MOV container, or DNxHR files wrapped in MOV, MXF OP1a, or MXF OP-Atom, all contain timecode. Different camera manufacturers will list in their technical specifications whether or not their camera formats contain SMPTE-compliant timecode.

In Premiere, you can check the properties of a clip in the QuickTime container, and the properties will show that there’s a timecode stream.

As long as the proxies you transcode out of Resolve have timecode that matches the timecode in the original digital negatives from the camera, each post-production application should be able to properly switch back and forth between the online and offline formats as they’re needed.

Pitfall #2: Not having an organized folder structure

One of the most important parts of mastering an online-offline workflow is to make sure that the folder structures for the Original Camera Negatives (OCNs) and the proxies are identical, and that those folders are tightly organized.

The reasoning for having an organized folder structure goes deeper than just keeping the online-offline workflow tamed. It pertains to the same underlying reason you’re using an online-offline workflow in the first place: both are methods that allow many disparate collaborators to work on a project seamlessly, quickly, and efficiently.

Let’s say you have two different collaborators that are separated by geography—this might be a director and editor who are in different cities, or two editors working on different deliverables, but who require the same source material.

If their folder structure is well-organized and identical, one collaborator can quickly direct the other to any particular shot. Both parties can recall the asset instantly. If different collaborators just start throwing their own assets into their own folder structures ad hoc, then projects will stall when collaborators waste time merely trying to recall particular assets.
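Keeping two trees identical is also easy to verify mechanically. The sketch below is a hypothetical helper, not part of any NLE; it assumes you can list each tree’s clip paths relative to its root, and it ignores file extensions, since proxies often use a different container than the originals:

```python
from pathlib import PurePosixPath

def tree_mismatches(ocn_paths, proxy_paths):
    """Return relative paths (extension stripped) present in one tree but not the other.

    Hypothetical helper: expects POSIX-style relative paths,
    e.g. "20180723/A001/Clip001.mov".
    """
    def normalize(paths):
        # Drop the extension so Clip001.mov matches its proxy Clip001.mxf.
        return {str(PurePosixPath(p).with_suffix("")) for p in paths}
    return normalize(ocn_paths) ^ normalize(proxy_paths)
```

Run something like this whenever new cards are ingested; an empty result means the proxy tree still mirrors the camera originals.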

Not having unique folder names for camera rolls

It’s important that a folder structure separates out every single camera memory card by a unique name. A common convention is to give every camera on a project its own letter, then number each lettered camera’s cards in the order the footage was shot.

So, if we had three cameras, we’d call one camera A, one camera B, and one camera C. If we shot 5 cards on A, 2 on B, and 4 on C, the full list of camera cards would be:

  • Camera A: A001, A002, A003, A004, and A005
  • Camera B: B001 and B002
  • Camera C: C001, C002, C003, and C004

This is one method to help alleviate the problem of camera files that might have name collisions. Even if you’re not able to rename such problem files, having each camera card with its own unique name means that if you need to go digging into an XML or a project file to check what the last good file path of a file might have been, you would quickly be able to see the difference between /A003/Clip001.mov and /F002/Clip001.mov.
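The convention is mechanical enough to generate and validate with a few lines of code. A sketch (the helper names here are my own, not an established tool):

```python
import re

# One uppercase letter for the camera, three digits for the card number.
CARD_PATTERN = re.compile(r"^[A-Z]\d{3}$")

def card_names(camera: str, count: int) -> list:
    """Generate zero-padded card names for one camera, e.g. A001..A005."""
    return [f"{camera}{n:03d}" for n in range(1, count + 1)]

def is_valid_card(name: str) -> bool:
    """Check that a folder name follows the letter-plus-three-digits convention."""
    return bool(CARD_PATTERN.match(name))
```

`card_names("A", 5)` yields A001 through A005, and a validator like `is_valid_card` can be run over an ingest folder to flag misnamed cards before they cause relinking trouble.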

Not separating camera cards by date

For documentary work, wherein you might have directors or producers trying to remember something that they shot on a particular date, it’s best not only to have each card separated out by a unique camera letter and number, but also by date.

At Freethink, the folder structure for camera originals and proxies has been standardized to look something like:

Original Digital Negatives

20180723

  • 20180723_Sound
  • A001
  • A002
  • B001
  • B002
  • C001

20180724

  • A003
  • A004
  • B003
  • B004
  • C002

Dates are in ISO 8601 format, so that the filesystem lists them in order, chronologically, by default.
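The reason this works is that ISO 8601 orders fields from most significant (year) to least significant (day), so a plain lexicographic sort is also a chronological sort. A quick demonstration:

```python
shoot_days = ["20180724", "20180723", "20181101", "20180801"]

# Lexicographic order equals chronological order for YYYYMMDD names.
assert sorted(shoot_days) == ["20180723", "20180724", "20180801", "20181101"]

# A US-style MM-DD-YYYY name breaks this across years:
# "07-23-2019" sorts before "08-01-2018" even though it was shot later.
us_style = ["08-01-2018", "07-23-2019"]
assert sorted(us_style) == ["07-23-2019", "08-01-2018"]
```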

When a director is collaborating with an editor who wasn’t on the shoot, the director can refer to footage in the natural way the human mind remembers it: “Hey, so we had a great shot of this on Tuesday morning…”

Pitfall #3: Not waiting until picture lock to conform

The online-offline workflow is designed to optimize work for each individual kind of collaborator. The offline editor gets color-corrected intraframe files at a low data rate, synced to sound, so that they don’t have to worry about syncing, color management, etc. The offline editor should just be able to play footage and set the order of the clips.

Broadly, the offline editor’s deliverable is a recipe for how particular clips should be arranged on a timeline. That deliverable might technically be a .prproj file, or an XML, or an EDL, but the format doesn’t matter. The goal is for the editor to hand off the particular recipe for the particular editing decisions they’ve made.

Online collaborators have very different goals, though. Colorists, VFX artists, sound editors, and sound designers aren’t finessing the exact timing of particular cuts—in fact they’re taking many precautions to ensure that the cuts on the piece they’re working on don’t change at all. It would be a dire technical mistake for someone in the online editing process to make a timing change to any particular clip.

Given how online applications are designed, once a piece has been conformed for color grading and sound mixing, even minor tweaks to the timing of edits can quickly spiral into time-consuming headaches.

Imagine, for instance, if you’ve already conformed a picture-locked cut, and it’s been sent out for color, VFX, and sound. Let’s say a director or producer wants to just tweak something “simple,” and just wants to swap one particular shot for another.

For the uninitiated, this seems like it would be extremely easy to do, because in an editing session, an editor could do this in a matter of seconds. However, if this piece has already gone to other collaborators, such a request creates a time-wasting headache compounded across multiple people. Here’s the headache that would ensue, starting with the assistant editor:

  1. The director, editor, and/or producer would need to sit down and specify the exact sequence timecodes for what needs to change.
  2. The AE would go grab the replacement clip and upload it.
  3. The colorist would need to be informed and given the exact in and out points on the timeline.
  4. The shot would need to be sent to the colorist—hopefully the colorist hadn’t already spent a lot of time on that shot, because any such work will now have been for naught.
  5. The VFX artist would need to be informed and given the exact in and out points on the timeline.
  6. The shot would need to be sent to the VFX artist—hopefully the VFX artist hadn’t already spent a lot of time on that shot, because any such work will now have been for naught.
  7. The sound editor would need to be informed and given the exact in and out points on the timeline.
  8. The audio for the clip would need to be sent to the sound editor—hopefully the sound editor hadn’t already spent a lot of time on that shot, because any such work will now have been for naught.
  9. The sound designer would need to make sure that any sound design elements are swapped accordingly—hopefully the sound designer hadn’t already spent a lot of time on that shot, because any such work will now have been for naught.

Are you seeing the pattern and problem? Making timing changes after picture lock is a great way to turn what would have been a 5-second task in the offline edit into up to an hour or more of wasted time across different collaborators.

Alexis Van Hurkman, author of the Color Correction Handbook and the DaVinci Resolve manual, says in his book that “locking the edit should not be looked at as a technological limitation but as a scheduling milestone. Sooner or later, the director and producer will have to make up their minds, be done with the edit, and allow the project to go through finishing.”

Pitfall #4: Not having clips with unique file names

When swapping from the offline media in Premiere to the online media in DaVinci Resolve, the filename itself is one of the primary ways the app relinks a clip. As the application traverses a specified directory recursively, it’s looking for particular filename matches.

Now, many “prosumer” cameras like DSLRs, GoPros, and drones simply aren’t able to keep track of clip numbering across card changes. In professional RED, Canon, Sony, and Arri cameras, the numbering of a new card will pick up from whatever the last clip of the previous card was numbered; but in a “prosumer” camera, every time a card is inserted and formatted, the numbering restarts. This could result in a single day’s worth of footage that has 10 different clips all named “C000,” 10 clips named “C001,” 10 clips named “C002,” et cetera.

One way to avoid this would be to stick to cameras that can keep track of numbering across cards—but with tight budgets, that’s not always possible.

Another solution would be to simply rename the clips so that every single one in the entire project is unique. Sony Catalyst Prepare can rename camera files as it ingests, while properly keeping all the sidecar metadata intact for Sony formats. The Mac OS Finder has a handy built-in function to batch rename files.
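If neither tool is available, the renaming logic itself is simple: prefix each clip with its unique card name. Here’s a hypothetical dry-run sketch; it returns the planned mapping rather than touching any files, so the plan can be sanity-checked first:

```python
from pathlib import PurePosixPath

def unique_rename_plan(clip_paths):
    """Map each clip path to a collision-free name by prefixing its card folder.

    Hypothetical dry-run helper: expects relative POSIX paths like
    "20180723/A001/C0001.MP4" and returns {old_path: new_filename}.
    """
    plan = {}
    for p in clip_paths:
        path = PurePosixPath(p)
        card = path.parent.name          # e.g. "A001"
        plan[p] = f"{card}_{path.name}"  # e.g. "A001_C0001.MP4"
    return plan
```

Because each card folder is already unique (A001, A002, and so on), the prefixed filenames are guaranteed unique across the whole project, even when the camera restarts its clip numbering on every card.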

Pitfall #5: Not using the right offline format

If you’re going to go through the whole process of preparing proxies for the offline editor, wherein you’re syncing audio, color correcting from log to Rec. 709, and meticulously keeping your folder structure for the proxies identical to the folder structure of the camera originals, you might as well use the correct format.

One of the primary reasons to use proxies in the offline edit is to make those proxies portable and easy to work with. Selecting a proper format designed for this is crucial.

A good proxy format accomplishes two objectives:

  1. A relatively low data rate
  2. Easy serving of frames—typically by using only intraframe compression and avoiding interframe compression entirely

A format with a low data rate will allow for more footage on fewer hard drives, and intraframe compression will allow the footage to be played on low-powered systems without very powerful CPUs or GPUs.
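The storage math is worth sketching out. Assuming an illustrative average of roughly 45 Mbit/s for a 1080p ProRes 422 Proxy stream (actual rates vary with frame rate and content), 100 hours of footage fits comfortably on a single consumer drive:

```python
def hours_to_gigabytes(hours: float, mbit_per_s: float) -> float:
    """Approximate storage for footage at a given average data rate (1 GB = 10**9 bytes)."""
    bytes_total = hours * 3600 * mbit_per_s * 1_000_000 / 8
    return round(bytes_total / 1_000_000_000, 1)

# Illustrative rate, not a vendor specification:
proxy_gb = hours_to_gigabytes(100, 45)  # about 2 TB for 100 hours of 1080p proxies
```

The same 100 hours of camera originals at several hundred Mbit/s would run well into the tens of terabytes, which is the practical argument for cutting offline on proxies.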

Remember, these proxies are just for the editor to figure out the timing of the edits; the format is not meant to preserve the image fidelity. Preserving image fidelity will be handled later when conforming the cut to the camera originals or a suitable digital intermediate format in preparation for the final color grade.

Two good options for proxies are Apple ProRes 422 Proxy or DNxHR LB.

Pitfall #6: Not using data burn-ins

The tightly organized folder structure is one part of the workflow that allows for quick recall among different collaborators, but it’s not the only part. Data burn-ins on the proxies are also crucial. (Data burn-ins are the various metadata and/or timecode information related to a clip, burned onto the image.)

The folder structure discussed above allows a collaborator to identify particular shots, camera cards, and shoot days; data burn-ins put that same information directly on the image itself.

Freethink’s practice is to include data burn-ins on the proxies that include:

  • Clip name
  • SRC TC [Resolve’s term for the embedded timecode of the video track]
  • AUD TC [the embedded timecode of the audio track, in case the production sound mixer didn’t jam sync perfectly, or if something went wrong with the timecode in the camera]
  • Date [the date the clip was shot]
  • Card [the camera card assigned, i.e. “A001”]

With data burn-ins on all the proxies, any collaborator can quickly call up a shot in the Source Monitor and, within seconds, navigate to any frame in a whole project. This scales from a short commercial that might only have an hour of footage all the way up to a documentary with hundreds of hours of footage.

Pitfall #7: Not understanding sizing between the online and offline apps

For many editors, it can be confusing to understand how different programs treat the reframing of clips. Pulling an XML from Premiere Pro into DaVinci Resolve can be frustrating, because the resulting positioning data is often wildly inconsistent: some shots will match up perfectly, but others will be way too small, way too big, or placed completely wrong within the frame.

It turns out there’s a secret: particular combinations of settings across both applications that, if you adhere to them rigorously, will allow you to consistently and reliably transfer your repositioning data out of Premiere Pro and into DaVinci Resolve via the XML.

Instead of spending hours and hours going shot by shot in Resolve and fixing positioning errors, you can now trust your shots to come into Resolve with proper placement.

This will obviously vary if you’re using other NLEs like FCP X or Media Composer, but the lesson here is that you can, and should, systematically run your own experiments so that you can figure out a reliable combination of settings that works for you.

Pitfall #8: Not syncing audio losslessly

Recall that the purpose of the proxy workflow is organizational—to allow for each individual collaborator to get exactly what they need to get to work creatively. How does audio fit into this?

For the process of syncing audio and then rendering your proxies, it’s best practice to take the interleaved [a.k.a. polyphonic] WAV files from the production sound mixer and losslessly rewrap the files into the proxies.

It’s tempting to transcode to a lossy, compressed audio format like AAC, but it doesn’t really accomplish anything. By losslessly rewrapping the uncompressed audio, not only will the offline editor have access to each individual microphone’s own ISO track, but when it comes time to conform the sound files for sound mixing and sound design, there won’t be any additional work in tracking down the original files from the production sound mixer’s field recorder.

Relative to video files, audio files—even uncompressed audio files—are tiny, so transcoding into a lossy format like AAC serves no purpose other than to create pointless additional work and to waste time. If you’ve already set up your storage to handle video files, you should still have plenty of space for uncompressed audio.
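The arithmetic backs this up: uncompressed PCM size is just sample rate times bit depth times channel count. At a typical 48 kHz/24-bit, even a stereo hour is only about a gigabyte:

```python
def wav_megabytes_per_hour(sample_rate: int = 48_000,
                           bit_depth: int = 24,
                           channels: int = 2) -> float:
    """Approximate size of one hour of uncompressed PCM audio (1 MB = 10**6 bytes)."""
    bytes_per_second = sample_rate * bit_depth * channels / 8
    return round(bytes_per_second * 3600 / 1_000_000, 1)
```

A stereo hour comes to roughly 1 GB, and even an eight-channel ISO recording is around 4 GB per hour—a rounding error next to the video.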

Within the Avid suite, there are indeed tools to go fetch the original field recorder files, but if you just include all of the uncompressed ISO tracks, you can skip that step altogether.

Conclusion

The online-offline workflow, when properly implemented, is robust and has been serving filmmakers well for about a century. The online-offline workflow enables collaboration and gains from the division of labor. However, in the modern age, with file-based workflows, we have to keep track of much more than physical reels of film. As you craft and refine the online-offline workflow for your next project, heed these warnings, and don’t fall prey to these mistakes. Your collaborators involved in the process will thank you for keeping everything running smoothly and efficiently.