
Visual Fidelity and Warnings of High Dynamic Range

You will most likely never be able to see how a film appeared during its initial run, but that is probably for the best.

Movie lovers today enjoy technological gains in consumer display electronics that can get them closer than ever to the look of a movie as its director or cinematographer intended. The current-generation standard of 4K pixel counts, high dynamic range (HDR) color, and ever larger consumer television screens have made the living-room movie experience genuinely competitive with the traditional theater.

Americans have widely embraced home theaters for their simplicity and ease of use. Over the past decade, as home theaters shipped with ever greater pixel density, the number of box office tickets sold has dropped drastically. Current display technology, combined with the rise of streaming, has given way to a generation of movie “stayers” rather than movie “goers.”

Plenty of moviegoers do not care about the means of media delivery. For many, the enjoyment of a film is largely unimpaired by the quality of the screen. Television watching has declined over the past few years in America as younger generations shift to smartphones, but adults over 18 still average a hefty 3-4 hours per day. The plot, acting, writing, dialogue, and blocking are all elements of a movie that the quality of the visuals barely affects. But with televisions seeing such heavy use in American households, it makes sense that they would be worth a premium portion of a family’s budget.

I am not saying that you should spend 3-4 hours per day watching television. Just that if you do, and you can afford it, you should at least maximize your enjoyment of such a large share of your daily free time. There are a million comparisons between standard-definition DVDs and ultra-high-definition Blu-Rays on YouTube worth a watch if you ever want to see how hard it would be to go backward in resolution. I also highly recommend https://caps-a-holic.com/ for checking out screencaps from movies.

The Good, the Bad, and the Ugly DVD Screencap
The Good, the Bad, and the Ugly UHD 4K Blu-Ray Screencap

Visual fidelity, the degree of precision with which a movie is reproduced from post-production to the consumer, matters in a visual medium. Movie fans pay quite a bit of money to watch a movie on the biggest screen during its theatrical debut. But there is something partially admirable about resisting the argument for fidelity. Movies and television have so much going on behind the scenes, with hundreds to thousands of people working on each production. Fans can watch John Wick on any screen and appreciate the plot about as well as on any other. The action, visuals, and sound, however, should all hit harder on better equipment.

Size seems an obvious factor in an individual’s preference for where to watch a movie. From 1950 to 1980, theaters only had to compete with household televisions that averaged under 20” of screen size. Over the past couple of decades we have witnessed a surge in screen size: television screens sold in the United States went from averaging 22” in 1997 to 50” by 2021. The people have spoken, and they clearly favor larger screens for the immersion they provide and the clearer visuals they help deliver.

Beyond screen size, the 4K pixel count is probably the most common and unambiguously positive feature of modern televisions. 4K content matters more now that 65” is the most popular television size sold in the United States, because the larger our screens become, the more noticeable individual pixels get. There is an argument for diminishing returns on pixel density, and for doubting whether 8K or beyond will have much impact, but we are not yet at the point of zero fidelity gains from added pixels. 4K media still has noticeable benefits over 1080P. A television that maxes out at 720P or 1080P is much further from the director’s intent than any 4K television, because nearly every film is captured at a much higher resolution.
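Whether that added density is actually visible depends on screen size and viewing distance. A common rule of thumb, assumed here, is that around 60 pixels per degree of visual angle is the limit of normal acuity; a rough sketch:

```python
import math

# Rough check of when individual pixels become noticeable. The ~60
# pixels-per-degree acuity threshold, screen size, and distance are
# illustrative assumptions, not measurements.

def pixels_per_degree(diagonal_in: float, horiz_px: int,
                      distance_ft: float, aspect: float = 16 / 9) -> float:
    # Screen width derived from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    px_per_inch = horiz_px / width_in
    # Inches subtended by one degree of visual angle at this distance.
    inches_per_degree = 2 * distance_ft * 12 * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

for horiz_px, label in ((1920, "1080P"), (3840, "4K")):
    print(f'65" {label} at 8 ft: '
          f'{pixels_per_degree(65, horiz_px, 8):.0f} pixels per degree')
```

By that yardstick, 1080P sits right at the edge on a 65” screen from eight feet (about 57 pixels per degree), 4K clears it comfortably (about 114), and 8K’s extra pixels would go mostly unseen at that distance.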

The vast majority of films in existence were shot on photochemical film. Most new films are shot digitally, but it was not until 2013 that digital overtook film in the number of yearly theatrical releases. Comparing pixel counts to traditional film, and especially to film projection, is very difficult. For traditional photochemical productions, the crew shoots a camera negative. That camera negative gets cut, an interpositive is struck from the cut negative, an internegative is struck from the interpositive, and release prints are finally made from the internegative.

Prior to films being mastered digitally, that print would most often have been 35mm stock, and less often 65 or 70mm. Those prints would have been subject to grime, dirt, and visual imperfections in the film, and then shown by a, hopefully talented, projectionist. By the time the film was projected in front of theater patrons, the picture quality was generally much worse than when the film was captured.

Left: Appearance of a film print from the original run of Jaws in 1975; Right: Appearance of the film print prior to projection

You will most likely never be able to see how a film appeared during its initial run, but that is probably for the best. And even though the full clarity of a film’s negatives could not be projected at the time, those negatives give digital artists sources with greater potential pixel density than the vast majority of digital films. The negatives have been used over the years to scan films into digital formats, and today the vast majority of theaters project these digital scans instead of film reels.

Film doesn’t have pixels. Pixels are a digital concept representing the smallest controllable element of a picture on a screen. A 4K LCD television, for instance, is named for the (not quite) four thousand pixels measured horizontally across the screen, for a total resolution of 3840 x 2160. The name 4K is a bit misleading in that sense, as that works out to roughly 8.3 million pixels, or 8.3 megapixels.
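The arithmetic is simple enough to check yourself; a quick sketch:

```python
# Pixel counts for the standard consumer resolutions named in the text.

RESOLUTIONS = {
    "HD (720P)": (1280, 720),
    "Full HD (1080P)": (1920, 1080),
    "UHD 4K": (3840, 2160),
    "UHD 8K": (7680, 4320),
}

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w} x {h} = {w * h / 1e6:.1f} megapixels")
```

4K comes out to about 8.3 megapixels and 8K to about 33.2, numbers worth keeping in mind for the film comparisons below.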

Instead of pixels, film relies on grain. Film grain refers to the silver crystals in a film’s negative, the result of silver halides turning into metallic silver when exposed to light during image capture. The resolution of film is a product of the frame’s area (e.g. 35mm) and the film speed: slower, less light-sensitive stocks have finer grain and resolve more detail. Film speed therefore gives film a variable effective resolution, whereas digital has a fixed one.
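As a crude back-of-envelope model, you can treat a frame’s effective resolution as its area divided by the area of a resolvable grain cluster. The frame dimensions and the grain figure below are assumptions for illustration, not photochemical measurements:

```python
# Crude model: effective "pixels" = frame dimensions / grain cluster size.
# Both numbers below are illustrative assumptions; real grain size varies
# with film speed, and grain does not sit in a neat grid like pixels do.

FRAME_W_MM, FRAME_H_MM = 22.0, 16.0  # roughly a 35mm Academy frame
GRAIN_MICRONS = 4.5                  # assumed effective grain cluster size

w_px = FRAME_W_MM * 1000 / GRAIN_MICRONS
h_px = FRAME_H_MM * 1000 / GRAIN_MICRONS
print(f"~{w_px:.0f} x {h_px:.0f} -> ~{w_px * h_px / 1e6:.0f} megapixels")
```

With those assumptions the math lands near 17 megapixels, in the same ballpark as the common estimates discussed below.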

Representation of the overlapping film grain structure of films

Laypeople comparing film to digital are often really comparing scans of film to digital. Film negatives or prints must be scanned to be projected digitally or pressed to home media. Photographer Ken Rockwell has an infamous blog post estimating the pixel equivalent of film, which he puts at a peak of around 87 megapixels. Most estimates put the pixel equivalent of traditional film closer to a peak of 18-20 megapixels. Modern digital cameras can reach that pixel count, with some tradeoffs relative to film.

Regardless, it is clear that the majority of releases, both film and digital, would benefit from a 4K transfer, as they were created on equipment that could capture 4K detail and beyond. Film at 18 megapixels means we could crank out more than the 8.3 megapixels 4K is capable of, though probably not quite the 33 megapixels 8K can achieve. Screens would also most likely have to get a little larger for 8K content to become noticeable, although early 8K-capable displays have garnered good reviews so far.
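To place that 18-megapixel estimate between the two digital standards, you can back out what scan dimensions a given pixel budget implies; the 16:9 aspect below is an assumption for comparison, since film frames use different aspect ratios:

```python
import math

# Convert a megapixel budget into approximate width x height at a
# chosen aspect ratio (16:9 assumed here purely for comparison).

def dimensions(megapixels: float, aspect: float = 16 / 9) -> tuple[int, int]:
    width = math.sqrt(megapixels * 1e6 * aspect)
    return round(width), round(width / aspect)

for mp in (8.3, 18.0, 33.2):  # 4K, film estimate, 8K
    w, h = dimensions(mp)
    print(f"{mp:>5.1f} MP -> roughly {w} x {h}")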

Increased pixel density and larger screens are two positives almost everyone agrees on. There are diminishing returns on pixel density, but we can safely say 8K will probably get us up to, and beyond, the limits of the vast majority of film and digital capture techniques. Without very large increases in household TV sizes, we will most likely never need much more pixel density than 4K or 8K.

There is another feature so common in consumer displays sold today that it has become one of the main selling points of 4K or Ultra High Definition (UHD) media: high dynamic range, or HDR. Among cinephiles, HDR enjoys a much murkier reputation than increases in resolution.

HDR is a complicated subject today, with television manufacturers and media distributors all using the term with some fluidity. The goal of HDR, which originated in photography, is to better recreate the captured image for the human eye. HDR formats such as Dolby Vision deliver dynamic metadata to your television that allows brightness and color to be finely adjusted scene by scene. HDR-branded televisions also require greater peak brightness, which allows for more “stops,” or doublings of light level, providing much greater contrast alongside a wider color gamut.
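For intuition, a stop is one doubling of light, so the number of stops a display spans follows from its peak brightness and black level: stops = log2(peak / black). A minimal sketch with illustrative luminance values (the numbers are assumptions, not certification specs):

```python
import math

# A "stop" is one doubling of light, so the span between a display's
# black level and peak brightness is log2(peak / black).
# All luminance values below are illustrative assumptions.

def stops(peak_nits: float, black_nits: float) -> float:
    return math.log2(peak_nits / black_nits)

print(f"SDR-ish  (300 / 0.10 nits): {stops(300, 0.10):.1f} stops")
print(f"HDR-ish (1000 / 0.005 nits): {stops(1000, 0.005):.1f} stops")
```

Raising the peak and lowering the black floor each buys stops, which is why HDR marketing leans so heavily on peak brightness.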

All that sounds like a good thing! HDR allows our televisions to come closer to the captured image than ever before, but HDR also lends itself to revisions or alterations of the original appearance of older movies. HDR is often quite good in newer releases; it makes a lot of sense for modern moviemakers to shoot their films with HDR releases in mind.

Older releases can benefit from HDR as well. Again, consumer media has not exhausted the full detail of film yet; additional gains in color and lighting should bring consumers closer to the original look of a film. The dynamic range of film varies by stock, but real-world use of film today can yield up to 10-13 stops, in line with the minimum 13 stops required for HDR certification.

The rub for cinephiles is that regardless of whether film has HDR-level dynamic range, it was never meant to be displayed on HDR screens. Furthermore, most UHD 4K Blu-Rays are rescanned from the same prints used for the HD Blu-Ray releases, so HDR has to be digitally grafted onto those older scans by digital artists. These experts can expand a film’s dynamic range, but the resulting picture may differ considerably from the original creative intent.
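To get a feel for why this is a creative act rather than a mechanical one, here is a deliberately naive sketch of SDR-to-HDR expansion. Every number and the power-curve choice below are illustrative assumptions; real colorists work shot by shot with far more sophisticated tools:

```python
# Deliberately naive inverse tone mapping: stretch a normalized SDR
# signal (0.0-1.0) toward an HDR luminance range with one fixed power
# curve. All constants are assumptions for illustration.

SDR_PEAK_NITS = 100.0   # assumed reference SDR peak
HDR_PEAK_NITS = 1000.0  # assumed HDR target peak
EXPANSION_GAMMA = 1.5   # assumed curve; boosts highlights more than shadows

def naive_sdr_to_hdr(signal: float) -> float:
    """Map a normalized SDR value to absolute nits."""
    return HDR_PEAK_NITS * (signal ** EXPANSION_GAMMA)

for s in (0.1, 0.5, 0.9, 1.0):
    print(f"SDR {s:.1f} ({s * SDR_PEAK_NITS:.0f} nits) "
          f"-> {naive_sdr_to_hdr(s):.0f} nits")
```

A fixed curve like this brightens highlights far more than shadows across the entire film, which is exactly the kind of blanket change a colorist has to override scene by scene to preserve the original look.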

You can find plenty of cinephiles completely against HDR for older movies, among them famed restorationist Robert Harris, who led the restorations of The Godfather and Vertigo. Harris claims HDR adds something in post-production that was never intended for the original release:

HDR is an option, much like ordering a different kind of leather or fabric for your new car's interior.
It's nothing new -- been around for years.
It's generally NOT a part of the design of a film, with most HDR entering the picture, no pun intended, during post.
"Wouldn't look neat if those flames were Really bright orange..."
It's added the same way that 3D is added in post, to the majority of 3D productions.
It has no relevance to production photography. - Robert Harris

But 35mm film has a far higher dynamic range than consumer releases have tapped into, so in the right hands we should expect some real gains in visual fidelity from HDR. The added luminance alone should let film scans appear closer to the originally captured image.

The way each home media release pushes us one step closer to, or further from, the original vision of a film may speak more to changing creative visions over time than to any limitation of technology. Creatives may want their rereleased film to carry HDR adjustments for brighter, more colorful images, regardless of whether the image preserves the original intent. Still, it seems important that the original images be preserved in some form, at a higher degree of visual fidelity than we have today.

HDR may differ from the original projected image of older movies, but it can theoretically get us closer than ever to the captured image. The issues most cinephiles have with HDR seem to be a symptom of a larger problem: digital alterations that change the original capture or intent of the film.

With each advancement in media display, from SD to HD and again from HD to UHD, movies have to be rescanned and reworked by digital artists. Directors or other creatives from the original production may even sign on to make sure the newest release preserves the film’s original vision. Each transition opens a window for revision, which has produced plenty of examples of newer releases that look drastically different from the original.

The newly revised movies may have better pixel density and a clearer image, but sometimes at the cost of a much worse, or at least much different, look. It can be as bad as a full George Lucas revisionist bent, as when the famous Star Wars director went back and changed the original trilogy years after release.

A drastic example is The Matrix. Released in 1999 with a fairly cool color palette, the film used a slight green tint whenever the characters jacked into the Matrix to give those scenes a dreamlike vibe. The sequels took this a step further, intensifying the green tint. When The Matrix was re-released on Blu-Ray in 2008, the original was regraded to look closer to its sequels, with a much heavier green tint.

Top: Original film print Bottom: Appearance on home Blu-Ray release

The director of photography for the trilogy, Bill Pope, oversaw the re-release of the film on 4K media, and the latest release was given a color grade far more faithful to the original.

A less drastic example of revisionist home media can be seen in almost every single release; there really is an art to remastering a movie for modern displays. Take the latest release of the Indiana Jones franchise on 4K UHD compared with its DVD and Blu-Ray releases. The newest UHD release retains quite a bit more film grain than the 2013 release and is noticeably darker, while still resolving more detail.

Either way, our comparison ends with the DVD release, as we do not have the original prints to judge creative intent against.

You, your friends, Netflix viewers, and even the directors of The Matrix, the Wachowski siblings, may like the swamp-monster green tint of The Matrix on Blu-Ray. But the Wachowskis made a different decision when filming their movie in 1999. The decisions that led to that original theatrical release, without the green tint, created the hit that was incepted into the brain of every millennial.

Note: I highly recommend this short video from Gizmodo on the process of restoration that the Criterion Collection goes through to renew older films.

The three screencaps below are from Jaws, one each from the DVD, Blu-Ray, and UHD releases. Hover over each picture to reveal which is which.

Standard Definition/DVD
High Definition Blu-Ray
Ultra High Definition 4K Blu-Ray