The title sounds like the worst Sci-Fi story ever written. Sadly, this post is just about digitally acquired film workflows, but the image below is still appropriate.
Before starting, I want to welcome Ben Baker to the SIXTEEN19 team. Ben used to be Head of the Digital Lab at Framestore in London. I met/worked with him when he was Post Supervisor on the Chronicles of Narnia: The Voyage of the Dawn Treader.
(That’s Ben in the center.)
I worked closely with Ben on location in Australia for nearly six months. We did some very exciting, dare-I-say pioneering work on that feature (at the direction and under the guidance of the great Jonas Thaler from Walden Media.) I think Ben and I would agree that it was a pretty damn great job to be a part of, and all of us who worked in any capacity in Post were a tight-knit group. It is an honor and a pleasure to work with Ben again!
Ben has an excellent blog, located here: http://blog.bluefishbaker.com/
Ben adds considerably to our team, which already had dozens of years of combined Post experience on hundreds of feature films. We are one of the few companies that has designed, executed and successfully completed digitally acquired projects where our clients not only don’t hate our guts, but they rave about how well the jobs went. Seriously, try to find three companies that have managed end-to-end workflows for digitally acquired features and earned high marks from the studios. I’m damn proud of our track record, because it is very hard work. I’m talking nights, weekends, ordering in.
It’s been a while since I blogged. Mostly, I’ve been away managing Post on some big/cool studio projects in Louisiana. Meanwhile, I’ve been working with our team at SIXTEEN19 to build tools that will serve as the “mobile software brain” of feature film workflows in 2011.
I’m very excited about these tools, since they will be our own proprietary products and I believe that our experience and approach to managing digital jobs is spot on.
The approach centers on near-set media asset management (MAM.)
We at SIXTEEN19 believe that the campaign to achieve perfectly organized metadata is the core of a robust, successful digital workflow and that effort must begin on set. HOWEVER, unlike some of our competitors, we believe that trying to do too much on set is unwise.
Culturally and functionally, Production is a different world than Post.
Production and Post are merging into one networked digital environment, which is the focus of this post. However, it is important to realize where it makes the most sense to organize and check material and that will never be on set.
It is more expensive to operate on set; there is an inherently higher percentage of error there; and the environment is unpredictable, hectic and not the best place to meticulously and patiently execute a careful process.
Yes, tools should be present to build efficiencies in organizing and moving both verified files and metadata to near-set dailies… However, the movement towards on-set dailies is a recipe for problematic finishing!
Finishing your film efficiently (i.e. quicker and cheaper) is the end that must be in mind when organizing your digitally acquired files and metadata. At some date, perhaps months or even years in the future, someone will need to take a list out of an editing machine and match back to every element in a film. We believe that it is the right of every conform artist and studio to expect that this process should go smoothly.
Elements may be located in a number of different places, from live spinning disks to long-term archives. Restoring and conforming a film (specifically digitally acquired films) is only as good as your database! The data in that database and the tools you have at your disposal to view and sort through that data are of paramount importance! So logically, the process and execution of the task of getting accurate data into that database is also vital (but anything but a given!)
I’ve hinted that good metadata management is not best accomplished on set. It is, however, accomplished near-set and can be done very near location, even on the same LAN (local area network.) On Narnia, we passed files directly from set to editorial over Gigabit Ethernet.
What we at SIXTEEN19 are in the business of creating are not Production workflows or Post workflows, or even Production-to-Post workflows. We are creating networked digital environments for feature films.
So what does a networked environment look like and how is it managed?
That is a broad question and the toolsets required for all professionals working on a film to contribute, modify and change metadata about a film are many and evolving. The heart of a networked environment is the mobile software brain that I referenced above and SIXTEEN19 is hard at work developing ours for 2011 shows.
So what does this brain look like?
The brain is a powerful back-end database which contains a record of all of your assets and related metadata, from camera original files to art concepts to previs to script to barcoded inventory. Sitting on top of all of these files and associated metadata are web-based tools. You can log in and open a custom view of data fields that allows you to access and view files and data in ways that are tailored and useful to your purpose. Furthermore, metadata tags associate different files and metadata so that they can be searched and sorted in ways that let professionals access and work on files efficiently.
That all sounds complicated and complex, so let’s walk through an example workflow and talk about the best way to manage it with the tools that we are building at SIXTEEN19.
Day 1: A script is broken down and checked into a database. A script supervisor spends time entering detailed information about the script into the database, so that the database can distinguish different parts of the script, such that it is “aware” that a particular line of dialogue is part of a certain scene and that, say, a certain scene is a VFX scene. The script supervisor continues to update the database as the script goes through versions. The current version takes precedence in the database.
Day 90: A Previs artist finishes a detailed animated version of a scene. She renders a QuickTime of the sequence and checks it into the same database as the script. Tools allow her to specify that the QuickTime is Previs of VFX Scene 7. As with the script, each new version can be updated and take precedence, so the database is “aware” of the latest version.
Day 150: The Art Department finishes detailed conceptual drawings and renders them as .TIFF files. The TIFFs are checked into the same database and given metadata tags to associate them with the scenes to which they belong.
All of the above elements in the database have thumbnail images in the database, as well as custom metadata fields that can be simple or more complex, depending on how deep professionals want to go in linking disparate elements together.
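The check-in-and-version pattern in these examples can be sketched as a toy data model. This is a minimal illustration only, not SIXTEEN19’s actual schema; every class and field name here is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One checked-in element: a script, a Previs QuickTime, an art TIFF..."""
    name: str
    kind: str                                      # e.g. "script", "previs", "art"
    scene_tags: set = field(default_factory=set)   # scenes this element belongs to
    versions: list = field(default_factory=list)   # file paths, oldest first

    def check_in(self, path: str):
        """Each new version is appended; older versions remain for reference."""
        self.versions.append(path)

    @property
    def current(self) -> str:
        """The latest version takes precedence."""
        return self.versions[-1]

class AssetDB:
    """A central database every professional can query by scene."""
    def __init__(self):
        self.assets = []

    def add(self, asset: Asset):
        self.assets.append(asset)

    def by_scene(self, tag: str):
        """Everything tagged with a given scene, regardless of kind."""
        return [a for a in self.assets if tag in a.scene_tags]

# Usage: check two Previs versions into the database for VFX Scene 7
db = AssetDB()
previs = Asset("scene7_previs", "previs", scene_tags={"VFX Scene 7"})
previs.check_in("previs/scene7_v1.mov")
previs.check_in("previs/scene7_v2.mov")
db.add(previs)
```

Querying `db.by_scene("VFX Scene 7")` now returns the Previs along with any script or art elements tagged to that scene, and `previs.current` always points at the newest version while the v1 path stays on file for reference.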
Day 180: The director is on set. Today’s scene is a motion capture of a highly complex action sequence in which a baseball player throws a large boulder in an animated world.
The director opens a window in the same database where the script, Previs and art concepts are checked in. He searches by the current scene, in this case VFX Scene 7. All of the above data lives on a central server that is accessible by any professional given a log-in and permission to view certain data. (This is the networked environment.)
The director can open a container window that shows the broken-down script; by clicking on a line of dialogue and holding down the “control” key, he can choose to see the latest Previs QuickTime associated with that line, or the Art Department stills. Having watched the Previs, the director can position actors and even bring the Previs up in the viewfinder of his digital camera to see how the actors will look in relation to the digital world. Having done this, the director shoots the scene.
Also Day 180: A DIT (Digital Imaging Technician), Sound Recordist and Script Supervisor are all busy (in this case) hand-writing reports on forms related to the picture, sound and script shot on set. The DIT writes reports per camera roll (in this case a CF card that files are being recorded on.) The Sound Recordist records notes about each take of sound by Sound Roll. The Script Supervisor records notes on each shot per shoot day.
The cold hard truth about the above is that all of the data entered by the DIT, Sound Recordist and Script Supervisor is suspect. Yeah, I said it. There is a margin of error on set approaching the 2-5% range. That means that 2-5% of the data entered is flat out wrong. The take name may be written incorrectly, the camera roll or sound roll may be wrong, a take may be listed on a report that never existed to begin with… This is why it is not a good idea to pass this data upstream without careful review, detective work and double checking. Passing on dailies without a detailed dailies process is a mistake. Even if data is eventually fixed in a nonlinear editing machine’s database, that fix does not translate to all departments and may occur after important processes such as creating dailies and archiving have taken place – far too late to avoid problems during conform.
It stands to reason, given the previous examples, that the shots captured on set will also be checked into the master database (networked environment) with a variety of metadata tags that associate those shots and their files with the other media related to the film.
A sensible workflow, in the minds of my colleagues and me at SIXTEEN19, does not include trying to quickly organize the metadata from those on-set reports and create dailies on set. We do believe it is a good idea to have H.264s and other files transcoded and viewable on set, as well as media captured for QC. This allows professionals on set to view dailies and make notes on iPads and other cool devices that make producers happy and carry forward to dailies. We, like our competitors, have solutions to do this but we do not see this as the core value of digital workflow management.
The core value is the software brain and the service of providing perfect metadata that exists in a networked environment.
Dailies are best left to near set, near editorial, where Post professionals can do what we do best, practice obsessive compulsiveness. (In just a minute, right after we wash our hands, again.)
In our opinion, making editorial dailies should not happen on set because the metadata passed on to nonlinear systems will eventually be your friend or foe months later and it is really your choice.
The files created on set (picture and sound) should be copied with verification and passed from set to a near set dailies department. That is where you need a highly experienced, extremely diligent dailies team that can untangle the many mysteries that are inherent to Production.
Remember, I am asserting that there is a margin of error of 2-5% on set (for a very good Production team.) The margin of error for a good dailies team should be 0%. Good dailies take time and should not be subject to breaking down and moving equipment, holding up transportation or pressure of Production distractions and schedules.
I have managed these jobs and on any given day it was my responsibility to make sure that the picture and sound files I received from set were checked in properly and that all metadata that arrived from set corresponded PERFECTLY. This is a full-time job with endless phone calls and puzzles to solve each day. These puzzles end up requiring complex solutions at times, with new procedures, details to communicate, problems to ring up the chain, and future problems to mitigate by anticipating obstacles to conform…
If you allowed just one issue to carry forward or remain unsolved each day, then on a 50-day shoot you are looking at 50 issues that will overwhelm someone later, costing time and money and jeopardizing schedules.
In dailies, every day must be put to bed with no lingering doubts or questions. Without the proper tools and expertise, it is easy for the conveyor belt to get away from you.
If there were any discrepancies between files and associated metadata as entered, which there are each and every day, it is our job to do detective work. Often, several takes of sound may be on one file. There may be a sound file for a take, but no corresponding picture (or vice versa.) Regardless of the error from set, by the time dailies are organized and matched to reports, there will be several amendments to reports and corrections to metadata.
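The first pass of that detective work, flagging takes where picture and sound don’t line up, can be sketched as a simple set comparison. A minimal illustration with made-up take names:

```python
def reconcile(picture_takes, sound_takes):
    """Flag takes that arrived with picture but no sound, or vice versa,
    so the dailies team knows exactly where to start digging."""
    pic, snd = set(picture_takes), set(sound_takes)
    return {
        "missing_sound": sorted(pic - snd),    # picture exists, no sound file
        "missing_picture": sorted(snd - pic),  # sound exists, no picture
    }

# Usage: compare today's camera reports against the sound reports
report = reconcile(
    picture_takes=["7A-1", "7A-2", "7B-1"],
    sound_takes=["7A-1", "7B-1", "7B-2"],
)
```

Every take the sketch flags still needs a human decision: was the take MOS, was the roll mislabeled, or does a file genuinely exist somewhere that never left set?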
Enter the SIXTEEN19 software brain that lives within the networked environment. (In development, with base functionality already available.) These tools represent the next generation of filmmaking workflows.
In our model, we get files and check them into the same database where all of the files and metadata described in the above examples live. That database is aware of the directory paths of each file. As we copy and organize files, our software brain not only verifies that every byte matches the original files, but it also records in the database where each file can later be found, in case we want to do something novel like deliver a file to VFX from an EDL, or restore original camera files or different versions thereof (DPX proxies for example) from our choice of disks or archive. (Should be easy, but isn’t on most jobs.)
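That verify-and-log step might look something like the sketch below: copy the file, compare checksums of source and destination, and record the destination path for later retrieval. Function and catalog names here are hypothetical illustrations, not our actual tools:

```python
import hashlib
import os
import shutil
import tempfile

def sha256_of(path, chunk=1 << 20):
    """Hash a file in chunks so large camera files don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verified_copy(src, dst, catalog):
    """Copy src to dst, verify every byte matches, and log where the file lives."""
    shutil.copy2(src, dst)
    src_hash, dst_hash = sha256_of(src), sha256_of(dst)
    if src_hash != dst_hash:
        raise IOError(f"Checksum mismatch copying {src} -> {dst}")
    catalog[os.path.basename(src)] = {
        "path": os.path.abspath(dst),
        "sha256": dst_hash,
    }

# Usage: copy a (fake) camera original into a dailies tree and log its location
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "A001C003.mov")
with open(src, "wb") as f:
    f.write(b"fake camera original bytes")
catalog = {}
verified_copy(src, os.path.join(tmp, "A001C003_copy.mov"), catalog)
```

The point of keeping the hash alongside the path is that months later, a restore can prove it pulled back the exact bytes that left set, not just a file with the same name.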
There are three varieties of metadata and a good software brain should be able to handle them differently based on their characteristics.
1) The first variety is metadata embedded in files. This data, such as timecode, is added by the camera and should be correct at time of acquisition. Unless flat-out wrong, this data needs to be extracted and added to our database, but should not need to be changed. If it is wrong, then that data needs to be changed once and only once. This is not data that needs to be versioned and worked on by professionals. It is what it is, to borrow one of my least favorite phrases.
2) The second kind of metadata is the camera, sound and script reports referenced above. (Plus any other metadata carrying forward from set, such as QC notes and color notes/LUTs.)
During dailies creation, this metadata needs to be patiently and meticulously checked. Any discrepancies need to be corrected and logged, with older versions maintained for reference. Whether the reports are entered on set by hand or electronically, SIXTEEN19 believes that one of its central roles is to create a new, corrected, perfect view of the electronic reports in our central database.
Here, and only here, under the guidance of an experienced dailies producer can anything akin to perfect metadata be achieved. Only when something akin to perfect metadata is achieved, can efficient collaboration and conform occur. These are the hallmarks of good workflows!
During the dailies process, each file (examples: right eye/left eye of a 3D shot and sound) must be built into a shot. The metadata in these shots is checked versus reports and the shots are graded. The resulting shots should be checked into our database and given metadata tags to associate them with other metadata in the system. Shots can be viewed, changed and versioned. We can create a container, for example, that now adds the motion capture shots from Scene 7 into the container that houses all other Scene 7 media. We can now view every shot and furthermore view the current VFX versions of each shot as they are created, complete with metadata about each distinct VFX process and its progress.
The end result of creating an environment with a smart brain is that you have new, electronic reports that are far more accurate than anything that can be achieved on set. These reports can be customized per professional and per use. You can then view these reports and click on them to link to notes about each shot, other elements associated with each shot or links to the files themselves. You can also use such a software brain to record the fact that you have transcoded shots or output/delivered them, and view reports about the location of physical media (such as a barcoded LTO tape or a DVD), its precise contents and the physical location of that barcoded asset.
What’s more, now that you have truly accurate metadata in a database, you can use that database to generate files such as XMLs or ALEs for an editing system. After you have checked your metadata in dailies, an ALE generated from accurate data will translate to accurate lists down the road and fluid conform/restoration, which is the end goal. Any metadata appended in a nonlinear system can be exported to the smart brain, and a new version of that metadata will take precedence over older versions, which are stored as reference.
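As a rough sketch of that export step: an ALE (Avid Log Exchange) is a tab-delimited text file with Heading, Column and Data sections, so it can be generated straight from corrected database records. This is a deliberately minimal ALE with only a few columns; a real export would carry many more fields, and the clip names below are made up:

```python
def make_ale(rows, fps=24):
    """Build a minimal ALE from corrected dailies records.
    rows: list of dicts keyed by the column names below."""
    columns = ["Name", "Tape", "Start", "End"]
    lines = [
        "Heading",
        "FIELD_DELIM\tTABS",      # declares tab-delimited fields
        "VIDEO_FORMAT\t1080",
        f"FPS\t{fps}",
        "",
        "Column",
        "\t".join(columns),
        "",
        "Data",
    ]
    for r in rows:
        lines.append("\t".join(str(r[c]) for c in columns))
    return "\n".join(lines) + "\n"

# Usage: export one verified take for the editorial system
ale = make_ale([
    {"Name": "7A-1", "Tape": "A001",
     "Start": "01:00:00:00", "End": "01:00:10:00"},
])
```

Because the rows come from the corrected database rather than raw set reports, the list the assistant editor imports already reflects every fix made during dailies.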
Truth: A central, storage aware repository for files and metadata that exists outside of any given product line is a missing link to the next 100 years of filmmaking. We are working to make this a reality on shows that we will be doing in early 2011.
This brings us to the third and final flavor of metadata, which is the kind that will set good software brains apart from bad ones.
To review, the first flavor of metadata is embedded in files and should simply need to be transposed. The second flavor is data about the production elements that is recorded on set and should be corrected/amended during dailies creation, but will usually remain static in the database after that.
The third kind of metadata is fluid and forms your workflow pipeline. This is metadata that changes as professionals work on the film. Much of this metadata does not exist when files are first created and is added, changed and versioned as processes are performed to shots.
A smart software brain will offer ever-expanding customized tools that allow professionals to access and view data and add and append data, all while keeping track of each version in the networked environment that everyone working on a film can share. That central software brain will accept simple exports of data from multiple systems in forms that are industry standard, such as XML, ALE, EDL, AAF. As tasks are performed, updated metadata will ripple through into current versions of cuts and all elements can be restored in an easy process of simply connecting storage or inserting tapes. Schedules will update and professionals will be managed as further toolsets are connected into the functionality of the brain. Many software packages such as Tactic and Shotgun offer much of the functionality described, but require metadata to be wrangled and managed in ways customized to digital workflows. Today, this process still requires separate systems and processes, but we are rapidly marrying these and developing the tools to tie everything together.
A perfectly executed dailies process, beginning on set but managed near set with a dynamic software brain should translate to near seamless conform (and rather resemble the Guggenheim Museum in New York.)
One last thought: Remember that anyone who has experience running these jobs can tell you that anything on set has a margin of error. Well, Production problems also occur. On a 3D job, for example, the cable that is supposed to run timecode to one camera may be unplugged, meaning that when a list is generated, the timecode reference to a shot is essentially nonsense to conforming software.
Traditionally, when a conform artist runs into a problem, it means unraveling a mystery and lots of swearing. There are usually no standardized notes accompanying turnover and artists often resort to digging through old emails to figure out why something that was supposed to be automated did not work.
Imagine, if you will, the software brain we are developing in action. That list gets imported into our software brain. Provided we managed the dailies with our software brain, the artist can simply open up an electronic version of all of the reports from set related to that scene and click on the shots in question to open up a notes field. That notes field will explain that there was a production problem and provide the corrected offset from the timecode on the other eye, giving a quick solution to the problem. Similar mysteries that currently require manual detective work are fixed nearly instantaneously.
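Once that note exists, the fix itself is simple arithmetic. A sketch of applying a known frame offset at 24 fps non-drop-frame (the timecode and offset values below are invented for illustration):

```python
FPS = 24  # non-drop-frame

def tc_to_frames(tc, fps=FPS):
    """'HH:MM:SS:FF' -> absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_to_tc(frames, fps=FPS):
    """Absolute frame count -> 'HH:MM:SS:FF'."""
    f = frames % fps
    s = frames // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{f:02d}"

def apply_offset(tc, offset_frames, fps=FPS):
    """Rebuild a usable timecode from the per-shot offset noted in dailies."""
    return frames_to_tc(tc_to_frames(tc, fps) + offset_frames, fps)

# e.g. the right-eye camera lost timecode; dailies notes record a +48-frame
# offset against the left eye
fixed = apply_offset("01:02:03:10", 48)  # -> "01:02:05:10"
```

The conform software never needs to know anything went wrong on set; it just receives a corrected reference.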
A properly conceived and managed software brain within a networked environment can serve as a powerful communication platform that preempts many of the most laborious, frustrating and expensive problems encountered in Post.
Problems that exist today, such as (sad but true) wrong versions of effects ending up in finished movies because the professionals lost track of versions, will be a thing of the past. And lastly, studios will be able to leverage all of the films in their vaults at any time. It will be as easy to unarchive and reconform a film ten years from now as it will be three months from now in the DI.
It is an important concept to understand that in six months to a year, your digital workflow will only be as good as your environment and the software brain on top of it. And as always, management of that brain is only as good as the vendor operating that brain.
If you are considering a digital workflow today, please contact us and we will coach you on how to employ much of what was just discussed. We expect to have a fully functioning smart brain operating at the center of shows that we begin in Q1 of 2011.
Tags: AAF, ALE, Ben Baker, Chronicles of Narnia: The Voyage of the Dawn Treader, conform, database, digital acquisition, digital workflows, digitally acquired films, EDL, Frame Store, gigabit ethernet, HXML, Jonas Thaler, LAN, Local Area Network, LUTs, MAM, Media Asset Management, metadata, metatags, mobile, mobile services, mobile software brain, near-set, perfect conform, Post, Production, Shotgun, SIXTEEN19, Tactic, Walden Media, web-based tools, XML