Even as a kid, I could see the visual difference between TV sitcoms - shot on videotape - and films. Video had a jarring sharpness that, paradoxically, made the picture less real. Films, with much higher resolution than video, looked more real, even though their motion blur and background softness - among other artifacts - were a clear departure from reality as our eyes perceive it.
Digital video has had an uphill battle in Hollywood, because many directors love the look of film. Quentin Tarantino's latest flick, The Hateful Eight, was shot in Ultra Panavision 70, a super widescreen format rarely used in the last 50 years, and shown in that version in only a few dozen cities.
But other directors and producers are seeing digital's benefits over film as the issues of digital sensors are resolved.
There is an immense amount of work going on in the industry to bring the analog world of film into the digital age. For one example, the Academy of Motion Picture Arts and Sciences - the folks behind the Oscars - has developed ACES, the Academy Color Encoding System. The goal is to define every aspect of a digital video image so it can be documented and "the look" reproduced.
This is more complicated than it might seem. According to the ACES website:
. . . ACES is becoming the industry standard for managing color. . . . From image capture through editing, VFX, mastering, public presentation, archiving and future remastering, ACES ensures a consistent color experience that preserves the filmmaker's creative vision. In addition to the creative benefits, ACES addresses and solves a number of significant production, post-production and archiving problems that have arisen with the increasing variety of digital cameras and formats in use. . . .
Consider that a digital production may use a half dozen different cameras with different sensors and file types, integrate visual effects (VFX) from several sources, be edited on multiple platforms, and then be compressed with a variety of codecs for delivery on everything from IMAX screens to smartphones. It's a complex task, one that used to be handled on film by the tech printing the film positives.
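The mechanical heart of a system like ACES is unglamorous: each camera's color space gets mapped into one common working space by a transform, so footage from different sensors can be graded together. Here's a minimal sketch of that idea in Python - the 3x3 matrix shown maps linear Rec.709 RGB toward ACES-style primaries, with values rounded for illustration, so treat the numbers as approximate, not authoritative.

```python
import numpy as np

# Illustrative 3x3 matrix mapping linear Rec.709 RGB into an ACES-style
# working space. Values are rounded approximations for illustration only.
REC709_TO_ACES = np.array([
    [0.4397, 0.3830, 0.1774],
    [0.0898, 0.8134, 0.0967],
    [0.0175, 0.1117, 0.8707],
])

def to_aces(rgb_linear):
    """Map a linear Rec.709 RGB triplet into the common working space."""
    return REC709_TO_ACES @ np.asarray(rgb_linear, dtype=float)

# Each matrix row sums to ~1.0, so pure white maps to (roughly) pure white,
# while saturated colors land differently - that's the point of the mapping.
print(to_aces([1.0, 1.0, 1.0]))
```

Real pipelines chain several such transforms - camera in, working space, display out - which is why a written standard for each link matters so much.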
There are many more standards required to make digital cinematography a consistent and reproducible technology. HDR standards are getting hammered out. High frame rates are encountering a serious learning curve as directors seek to preserve the filmic look.
The Interoperable Master Format is a proposed standard that seeks to make a single file type with all the metadata required for all the different processes a movie needs before release. And there's much more technical work going on in the motion picture industry.
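The idea behind an interoperable master is easy to sketch even if the real specification is not: one package holds the picture and sound essence plus the metadata every downstream process needs, and each deliverable is described as a rendering of that single master rather than a separately hand-built file. The Python below is a toy model of that concept - the class, field names, and targets are my own illustrative inventions, not the actual IMF schema.

```python
from dataclasses import dataclass, field

@dataclass
class MasterPackage:
    """Toy model of a single-master package (not the real IMF schema)."""
    title: str
    video_essence: str                       # reference to the picture track
    audio_essences: list = field(default_factory=list)
    color_space: str = "ACES"                # assumed working space
    frame_rate: float = 24.0

    def deliverable(self, target: str) -> dict:
        # Every delivery (IMAX, broadcast, mobile) is derived from the
        # same master, so its description points back to one source.
        return {
            "source": self.title,
            "target": target,
            "color_space": self.color_space,
            "frame_rate": self.frame_rate,
        }

pkg = MasterPackage("Example Feature", "picture.mxf",
                    audio_essences=["audio_en.mxf"])
print(pkg.deliverable("mobile"))
```

The payoff is the same one the standard promises: change the master once, and every downstream version inherits the change instead of being rebuilt by hand.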
All this has led to a new job title: the DIT or digital imaging technician. Look for it in the credits of most major pictures this Christmas.
I expect that much of this needed digital infrastructure will get sorted over the next decade, so directors and cinematographers can focus on the creative side of pictorial storytelling. But it's a reminder that digital has complicated things because it is so precisely controllable.
Analog processes of the past didn't need such precision because they substituted a colorist's eye for a detailed metadata description of every variable. But don't worry about this. Just go to the movies and enjoy the show!
Courteous comments welcome, of course.