This week's blog dives into the world of cinema image quality and formats. Tom Bert, guest writer and digital cinema pro at projector manufacturer Barco, examines the digitization of cinema, the buzz around HDR, and the kind of image quality you can expect to see soon at a cinema near you!
- A historical take on HDR buzz
- From DLP in 1999 to the 2nd wave
- Is consumer technology the driving force?
- The importance of content (and price)
- Is 4K the new normal?
A historical take on HDR buzz
In 2018, HDR, or High Dynamic Range, has been the talk of the town when it comes to image format and image quality in cinema. Given all the attention and buzz of recent months, it's easy to forget that these kinds of new-technology discussions go through an evolution of their own.
Most of us probably still remember the attention that (higher) frame rates received with the 2016 release of "Billy Lynn's Long Halftime Walk"; and going back even further, James Cameron demonstrated 3D HFR footage in 2011. But who still remembers the curve that 3D technology (the days of "full-resolution triple flash") went through? In this article we zoom in on the evolution and state of play of another image quality metric: resolution. In what could be called a historical battle between 2K and 4K, resolution has gone through a lifecycle of its own in cinema.
From DLP in 1999 to the 2nd wave
On June 18, 1999, DLP Cinema projector technology was used for the first digital cinema screening, that of Star Wars Episode I: The Phantom Menace. The projectors had a native resolution of 1280 x 1024, and the 2.35:1 aspect ratio was achieved through an anamorphic projection lens. That means that when this wonderful industry of digital cinema was born and the first steps towards a new future for a whole industry were taken, we were not even using 2K resolution. Even more: we were stretching the pixels horizontally to fit the scope format. In the context of those pioneer days, it's perfectly understandable that this was accepted: other, more important issues had to be solved back then, when the whole concept of digital was new. And the 1280x1024 resolution was not at all bad given the industry standards of display technology in those days. In 1999, more than 90% of TV sets sold in the US used CRT technology.
When DCI (Digital Cinema Initiatives, LLC) was formed in 2002, this organization put forward the standards that would define digital cinema as we know it today. Two resolutions were included in the standard: 2K (2048x1080) and 4K (4096x2160).
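To put those two containers in perspective, a quick back-of-the-envelope comparison (the variable names are illustrative, not from the DCI specification) shows that the 4K container carries exactly four times the pixels of the 2K one:

```python
# Compare the pixel counts of the two DCI container resolutions.
dci_2k = (2048, 1080)
dci_4k = (4096, 2160)

pixels_2k = dci_2k[0] * dci_2k[1]  # 2,211,840 pixels
pixels_4k = dci_4k[0] * dci_4k[1]  # 8,847,360 pixels

print(f"2K: {pixels_2k:,} pixels")
print(f"4K: {pixels_4k:,} pixels")
print(f"Ratio: {pixels_4k / pixels_2k:.0f}x")  # prints "Ratio: 4x"
```

That 4x factor is why "more capacity, bandwidth and hence investment" comes up throughout the discussion below: every step of the chain has to move and store four times the data.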
Since that standardization, the cinema market has achieved almost 100% digitization. That so-called "first wave" has left the market with the majority of screens and projectors being 2K. No official numbers exist, but an 80/20 split between 2K and 4K is likely. Several drivers have contributed to that. Cost was a first one: 4K technology requires more capacity, bandwidth and hence investment than 2K. This applies on the one hand to content creation: cameras, rendering computers, storage, transport, … were in the early days significantly more expensive in 4K than in 2K. But it also applies to exhibition: 4K projectors and media servers carried a price premium when introduced to the market.
Another factor was the availability of 4K content. Titles released in 4K were scarce in the early days of digital cinema: in 2012, for instance, only 11 movies were distributed to cinemas in 4K. This meant that the incentive for the industry (and exhibitors specifically) to move to 4K technology was also small.
Today, almost 20 years after the formation of DCI, cinemas are looking into renewing their projection equipment. Zooming in on resolution, what is the outlook for the 2nd wave that lies ahead of us?
Is consumer technology the driving force?
The cinema industry has consistently managed to innovate the viewing experience ahead of home entertainment. Today however, we are on the verge of being leapfrogged by consumer technology.
Prices for 4K TVs are coming down fast, to a level where they are lower than (for larger screens) or competitive with FullHD (1920x1080, the consumer equivalent of 2K) sets. This is leading to fast adoption of 4K TV technology in the home: already in 2016, more than 50% of all TVs sold in North America were 4K. For TVs over 60 inches, UHD accounted for 96 percent of shipments; this year it will be 99 percent.
Note that 4K in the home is actually 3840x2160, just below the cinema specification; it is also called "4K UHD" to indicate the difference. This year marks the sixth year since 4K UHD was commercialized in 2012. In 2000 it was HD (High Definition, or a million pixels) and in 2006 it was FullHD (2 million pixels): every six years there has been a generation shift. The message is clear: after a 10-year run, FullHD in the home will soon be obsolete at the high end, just as HD went extinct in 2009.
4K movie releases for the home are picking up fast, with 4K UHD Blu-ray releases (not streamed content) growing by almost a factor of 10 since 2012. The growth in 4K cinema releases is also clear, but not yet to the same extent as movies released for the home. Still, in just 4 years' time, the number of titles more than tripled.
The importance of content (and price)
Going into the 2nd wave of digital projection in cinema, the stars are aligned differently compared to 10 or 15 years ago. Movie-goers now find 4K normal: can it also become the new normal in cinema? Will patrons even accept and understand it when their preferred movie medium lags behind their TV in a certain specification?
Let's look at the first driver: availability of content for the cinema. As more titles are distributed and promoted in 4K, exhibitors will be incentivized to embrace and adopt 4K projection technology. Looking at the availability of titles for the home mentioned above, you see that content creators are organizing their workflows around 4K. This does not mean there will be a one-on-one spillover to cinema, nor that 4K will reach 100% market share just like that. But there is an undeniable trend emerging. This trend should be combined with another one: the dropping cost of computing power and network bandwidth. Stimulated by overarching trends like cloud computing, among others, the cost of rendering and processing 4K has dropped to levels close to those of 2K. The complexity and cost of transferring 4K content (which is larger than lower-resolution content) are also no longer limited by the cost of internal and worldwide network capacity. Local networks inside a cinema, postproduction facility or studio lot are stepping up from Mbps to Gbps (gigabits per second) capacity.
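To see why Gbps-class links matter for 4K workflows, a rough estimate of the uncompressed data rate helps. The sketch below assumes typical DCI-style values for illustration: 12 bits per color channel, 3 channels, and 24 frames per second (these parameters are assumptions for the estimate, not a quote from any specification):

```python
# Back-of-the-envelope: uncompressed data rate of a DCI 4K stream.
# Assumed parameters (typical DCI-style values, for illustration only):
width, height = 4096, 2160   # DCI 4K container
bits_per_pixel = 12 * 3      # 12 bits per channel, 3 color channels
fps = 24                     # standard cinema frame rate

bits_per_frame = width * height * bits_per_pixel
gbps = bits_per_frame * fps / 1e9

print(f"Bits per frame: {bits_per_frame:,}")
print(f"Uncompressed rate: {gbps:.1f} Gbit/s")  # roughly 7.6 Gbit/s
```

Actual theatrical deliveries are JPEG 2000 compressed and therefore far smaller, but uncompressed or lightly compressed material moving around a facility during production is what pushes local networks from Mbps into Gbps territory.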
The 2nd driver for exhibitors will be the price difference between 2K projectors and servers and their 4K alternatives. Initially this premium was higher than 10%, but it is coming down as well, driven partly by standard price erosion and partly by the introduction of new technology. Miniaturization and the availability of more powerful processing chips will soon make it possible to offer next-generation 4K products at the price point of first-generation 2K products.
Is 4K the new normal?
Historically, 2K has always had the edge on 4K in cinema. The first digitization wave left the market with an 80/20 split. Since then, however, we have seen forces in the consumer space make 4K the new normal. The disappearing cost and complexity hurdles in cinema could make it the new normal for the 2nd wave as well. What is your prognosis? Do you think exhibitors can afford not to choose 4K when making their technology choices for the next 10-15 years?
Reprinted by kind permission of the author. The contribution was first published on LinkedIn!
More blogs you might like:
4K, 8K, 16K, the race for resolution is on!
Five technology watersheds