Metadata is extra information attached to your source files that describes them in finer detail.
For example, have you ever opened a photo from your phone in your favorite image editor? If you head to the Properties or File Info dialog, you’ll see some interesting metadata. Your phone’s make and model, the shot’s exposure and shutter speed, even the location where the photo was taken – all of this extra info is seamlessly folded into every shot on your phone. That’s metadata.
But how does metadata affect postproduction? Let’s sift it into two categories:
Systems use metadata to process files in granular ways. For instance, operating systems use metadata to display common file attributes such as Type, Size, and Date Modified, which can be visually grouped and sorted with the click of a column.
NLEs and VFX tools also use metadata to expose crucial details about your project’s assets. Is this clip a QuickTime MOV, or an MP4? Is the codec a flavor of ProRes, or H.264? What’s the frame rate? Bitrate? Does it have linear PCM 48 kHz stereo audio, or no sound at all? All of this metadata is baked into your files to help the host application do its job.
But not all metadata is for machine use only. More and more postproduction software environments give you, the human, the tools to apply and wield metadata in a meaningful way.
Keywords are specific terms or phrases you can attach to each clip. Once you’ve created your own keyword set, it can help you and your team get (and keep) your arms around your project. The good news? You don’t need an exhaustive set of keywords to see the payoff.
For example, keywords can help you stay organized. Some apps generate keywords based on your project’s folder structure when you import media. It’s a rudimentary method of asset tracking, but if you’re following your project’s file and folder naming convention, it’s another breadcrumb trail back to your assets’ origin. This comes in handy when working with incoming assets from disparate sources.
Keywords also help you look at your footage as ideas, not just as clips in a bin. Names, places, emotions, moments – all of these ideas are candidates for keywords that can quickly become part of your team’s vocabulary.
Once those keywords are available, that metadata becomes the basis for locating any asset in your project. Type some keywords into your NLE or VFX app’s Search function to quickly narrow down the number of clips. Can’t recall the exact keywords? Some apps allow you to filter results by selecting keywords with checkboxes and logical conditions. Others let you save those keyword searches or filters for later use. In other words, adding keywords lets you create a custom search engine for your project!
So do you have Interview shots of John Smith, but also shots of John in the Factory as B-Roll? Isolating those ideas, then translating them into keywords, will help you find what you need even faster during the edit.
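That kind of checkbox-style keyword filtering boils down to set logic. Here is a minimal sketch in Python, using a hypothetical clip library – the clip names and keywords are illustrative, not drawn from any real NLE’s API:

```python
# Hypothetical clip library: each clip carries a set of keywords.
clips = {
    "A001_interview.mov": {"john smith", "interview", "close-up"},
    "A002_factory.mov":   {"john smith", "factory", "b-roll", "wide"},
    "A003_factory.mov":   {"factory", "b-roll", "machine"},
}

def match_all(library, keywords):
    """AND logic: clips tagged with every keyword (all checkboxes ticked)."""
    wanted = set(keywords)
    return sorted(name for name, tags in library.items() if wanted <= tags)

def match_any(library, keywords):
    """OR logic: clips tagged with at least one of the keywords."""
    wanted = set(keywords)
    return sorted(name for name, tags in library.items() if wanted & tags)

# Interview shots of John Smith vs. John in the Factory as B-Roll:
print(match_all(clips, ["john smith", "interview"]))  # ['A001_interview.mov']
print(match_all(clips, ["john smith", "factory"]))    # ['A002_factory.mov']
```

Saving a filter for later use, as some apps allow, is then just a matter of storing the keyword list and the AND/OR condition.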
Project-specific metadata may also come from other sources, such as cameras or Script Supervisors. If available, it becomes another way to quickly find and filter project assets so you can keep that momentum going during the edit.
Metadata – useful for systems, but meaningful for you. All postproduction apps support system-grade metadata, but more and more also support meaningful metadata that lets you organize and sift through hours of footage in mere seconds. When used effectively, it can help you make creative decisions more efficiently, collaborate successfully, and deliver on time.
When you use an NLE that is heavily metadata-oriented, and you make full use of its metadata features, it can open up an entirely different way of organizing and selecting your footage.
Organizing clips and folders is one of the most important tasks in editing because, if you do it properly, it allows you to find the clips that you need much more quickly. Traditional NLEs have a whole suite of features that allow you to identify the right clips quickly.
Metadata-driven editing replaces those traditional techniques with two simple but powerful functions: tagging and searching.
Tagging happens at the beginning of the project, and it’s as simple as it sounds. The editor (or an assistant) tags each clip with all of the attributes that may be useful later. Some software calls these “keywords”, but the concept is the same.
A clip will be tagged with the obvious information like the act number, the scene number, the camera, etc. In fact, those may automatically be applied if the production crew has handled their metadata properly.
But there is also a lot of more interesting metadata to add. You might add the names of the characters in the shot, whether the director was happy with the performance, the type of shot (wide, medium, close-up), whether the shot is moving or not, whether a particular prop is visible in the shot, or whether or not the actor remembered to wear his vest.
You can also add metadata to a portion of a clip instead of to the whole clip. Where a traditional editor might make a subclip, you would simply mark that same section as “good.”
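One way to model range-based tags is to attach each keyword to a time span rather than to the whole clip. A sketch in Python – the data layout and times are assumptions for illustration, not any NLE’s actual format:

```python
# Each tag applies to a (start, end) range within the clip, in seconds.
# A whole-clip tag simply spans the full duration.
clip_tags = [
    (0.0, 94.0, "scene 4"),   # whole take
    (0.0, 94.0, "rachel"),
    (12.5, 31.0, "good"),     # where a traditional editor would subclip
    (58.0, 80.5, "good"),
]

def ranges_for(tags, keyword):
    """Return the (start, end) spans marked with the given keyword."""
    return [(start, end) for start, end, tag in tags if tag == keyword]

print(ranges_for(clip_tags, "good"))  # [(12.5, 31.0), (58.0, 80.5)]
```

Searching for “good” can then jump straight to those spans, which is exactly what makes ranged tags a fluid replacement for subclips.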
Some of this metadata can even be added automatically by computers using machine-learning technology to automatically detect characteristics of the shot.
Now that you’ve put in all of the work to tag your footage with metadata, it’s time to make use of it. “Collecting” is just a general term for how you work with metadata, and it’s as simple as typing a few words into a search bar.
Typing “4 Rachel” will instantly show you all of the shots of Rachel from scene 4. If you add “best”, it will only show the takes that you’ve also marked as the best ones. Perhaps Rachel begins the scene sitting on the couch and then walks to the window. If you add “couch”, that shows you the best shots that include both Rachel and the couch in scene 4. You did all of that just by typing “4 Rachel best couch”.
Now that you’ve collected the shots you wanted, you can save this view for later, but that is often not necessary, since it only takes a couple of seconds to type those same words again.
If you want a stringout of scene 4, all you do is type “4” into the search bar. If you want your selects of scene 4, you just type “4 best”.
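The search behavior described above amounts to splitting the query into terms and keeping only clips tagged with every term – an implicit AND. A rough sketch in Python, with hypothetical tag data rather than any real NLE’s search implementation:

```python
# Hypothetical tagged takes for scenes 4 and 5.
takes = {
    "S004_T01": {"4", "rachel", "couch"},
    "S004_T02": {"4", "rachel", "couch", "best"},
    "S004_T03": {"4", "rachel", "window", "best"},
    "S005_T01": {"5", "rachel", "window"},
}

def search(library, query):
    """Treat each word in the query as a required tag (implicit AND)."""
    terms = set(query.lower().split())
    return sorted(name for name, tags in library.items() if terms <= tags)

print(search(takes, "4"))                    # stringout: every take in scene 4
print(search(takes, "4 best"))               # selects of scene 4
print(search(takes, "4 rachel best couch"))  # ['S004_T02']
```

Because the query is just a few typed words, re-running it later is as fast as saving it, which is why saved views are often unnecessary.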
As you can see, metadata-driven editing tends to be much more fluid and adaptable than traditional organizational methods, but it requires the editorial team to shift how they think about sorting and finding clips, which can be disruptive to existing workflows.
FCP X was the first NLE to provide advanced metadata features, but metadata-driven editing is gaining popularity, and many other tools are beginning to add similar features.