Interviews: October 2013
Tom: Here with me today is Charles Poynton, author of Digital Video and HD, Second Edition: Algorithms and Interfaces. If you’ve not had an opportunity to hear Charles speak, then you’re in for a treat. To give you a sense of how much of an impact Charles has made in the display technology industry, here’s a quote from Mark Schubin, SMPTE fellow, with regards to Mr. Poynton’s book: “He just says it all. This is the gamma sutra, a guide to the pleasures of understanding electronic pictures. It’s like having the world’s best teacher give you a private seminar on whatever you need to know.”
I don’t know what more I could add than that. Charles, it’s a huge honor to have this opportunity to tackle some big topics here that we have in mind. There are a number of things people are concerned about and excited about as 2014 approaches with regard to the evolution of display technology and display calibration technology. Both are changing. Which concepts stay the same, and which ones change, as we move into a new world of display technology to produce and master UHD, 4K, and maybe even 8K-based media? Charles, welcome to the show.
Charles: It’s a pleasure to be here, Tom. Thanks for inviting me to participate.
Tom: Well, great. I’m just excited to have you here. You’ve been instrumental in defining the standards for display technology since the ’80s, actually, or maybe even sooner than that. I’m not going to even attempt to summarize. I say that because we have a mutual friend from the ’80s in the display area. Tell us a little bit about the key things that you’ve done in the industry. I would prefer that you summarize it for me.
Charles: Oh, gosh. Being Canadian, I’m not one to be blowing my own horn. If you push me to do it, I’ll do it. Well, I cut my teeth designing and building and soldering together hardware for studio video in the early, early, early days of digital video. That was my thing.
Tom: You were a hardware guy.
Charles: I did both hardware and software in those days. Because I’m an old-timer, back in those days you could effectively do both. There is only a small number of people who are really conversant on both sides of that. I feel myself privileged and lucky to have grown up in that era.
I did also in that era plug into the SMPTE standards community. I had a hand in several standards for digital video, including even Rec. 601, the very first digital video standard for SD. Then for HD, the SMPTE 274M standard for 1080 came from my fingertips, actually. I was the document editor for that standard.
Those were the early days. Since then, I’ve discovered, just personally in my own work, that although I continue to do work in straight-up HD and digital video and even digital cinema, the really challenging part for me is the color science. For the last – I don’t know – 8 or 10 or 12 years, I’ve been spending a lot of time doing straight-up color science. That then touches the calibration and display part of your question. I’m also very much involved in the image science issues behind emerging technologies – 4K, UHD, 8K, those things. I’m pretty much in touch with the development of those technologies.
Tom: Are you still involved in SMPTE?
Charles: Less so.
Tom: There’s a lot of travel for that I know.
Charles: Well, they do still travel, which some people would argue they don’t really need to do anymore. If I can be really blunt and un-Canadian, there is an element of kind of old boy-ism.
Tom: Yeah, that’s what I’ve heard.
Charles: Yeah. That part is bad. Opening it up to the young kids and making it possible to telecommute to the meetings would be a good move to shake things up a little bit. I’ve noticed that there are serious problems in that domain, with standards being written that really are just on the verge of being outside the expertise of the people who are writing them. If you don’t get the people who are really the super experts to go to the meetings, then you end up with bad standards. I am still involved to some extent, but not as much as I used to be.
Tom: Okay. Well, moving on. The title of our conversation is “Display Technology and Display Calibration.” What stays the same and what will change as we move into a new world of mastering this new media at these high resolutions? Would it be fair to say that many of the basic concepts in display technology remain the same, regardless of what standard you’re mastering to? For example, I know you’ve written a series of 18 articles on color science topics for SpectraCal for display calibration specialists. I’ll put that link in the blog post for this.
Charles: Yeah. I must say I’m quite proud of those. They ran one a month for a year and a half. Each one of them is exactly two pages. They’re, I think, a very good encapsulation of issues that are still relevant today in displays and display calibration.
Tom: I agree. I went through them all here recently or most of them anyway just in preparation for our talk. They’re called “The Charles Poynton Vector.”
Charles: Now, there’s a little background story behind that. Actually, strictly speaking, they’re called “Poynton’s Vector.”
Tom: Vectors. Oh, okay.
Charles: No, and not even plural. I mean, there is more than one of them – there are 18 of them – but they’re all in the “Poynton’s Vector” series. The inside joke is this: my last name is Poynton, as you know. There was another, completely unrelated guy from, I want to say, even a little more than a hundred years ago. His name was Poynting – P-O-Y-N-T like my name, but then ending in I-N-G. He’s a guy who worked on the deep theory of electromagnetic radiation in the old, old, old days, just after James Clerk Maxwell – Maxwell being really the first guy to identify the physics behind the propagation of electromagnetic waves.
Anyway, I’m getting ahead of myself with this story. Poynting did some math that hinged on what Maxwell did. Poynting identified that the cross product of the E vector and the H vector gives the direction of propagation – a really, really super important thing. That vector became known as the “Poynting Vector.” When it came time to name this column, I couldn’t resist.
Tom: Couldn’t resist for those in the know.
Charles: That’s where we get “Poynton’s Vector.”
Tom: Well, all right. With regards to the topics that are potentially the same as we move from current media to higher resolution media in a color science perspective, let’s say, the things that could potentially stay the same, you know, we still have to set black levels. There are white levels, gamma settings given for a specific environment, decoding de-matrixing, I suppose too. What stays the same as we move from HD to 4K?
Charles: It’s potentially a deep question, but let me touch on just some major items.
Tom: Sure.
Charles: White point, everybody agrees, is just D65, so let’s lock that down. We really should specify the luminance of white. Now, I’m using luminance as a very specific term: it’s the measurement of the physical power associated with the light that you produce that you call white. It’s measured in candelas per square meter, otherwise known as nits. Nits is just the colloquial slang term for candelas per square meter. It’s a huge failure at the moment in all video standards, including even studio HD, that that number is not specified in a standard.
Tom: Interesting.
Charles: Yeah, it’s very interesting. I personally think, and a dozen of my colleagues think, that that number should just be standardized at a hundred nits. A hundred nits is the level that you would be using when you’re grading video. I mean, I’m not sure exactly what you do at your place; probably a hundred candelas per square meter.
Tom: Yeah, I’d have to go back and check, but I think that’s really close.
Charles: The thing, though, is that in terms of standards, it’s not standardized. Pretty clearly it should be, because our visual perception of colors depends upon the amount of light being produced for those colors. We sort of scale up and down the whole scale. We can experience a colorful picture from a movie screen even though the maximum luminance is 48 nits. On a movie screen, the typical luminance of a white shirt or a white piece of paper would only be 24 nits. We still get that colorful experience, but the perceptual experience is modulated by the overall amount of light. That needs to be standardized. It’s absolutely clear.
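To make the point concrete, here is a minimal sketch of why the luminance of white needs to be standardized: the same digital code value maps to very different absolute light levels depending on which peak-white luminance a display uses. The function name, parameters, and the pure power-law EOTF below are illustrative assumptions – a simplification of the actual BT.1886 transfer function, not any standard’s formula.

```python
def code_to_luminance(code, peak_nits=100.0, gamma=2.4, bit_depth=10):
    """Map a full-range digital code value to absolute luminance in cd/m^2
    (nits), assuming a simple power-law EOTF -- an illustrative
    simplification, not the exact BT.1886 formula."""
    max_code = 2 ** bit_depth - 1
    relative = (code / max_code) ** gamma  # relative luminance, 0..1
    return peak_nits * relative

# The identical code value produces different absolute luminance on a
# 100-nit grading display versus a 48-nit cinema screen:
print(code_to_luminance(1023, peak_nits=100.0))  # 100.0
print(code_to_luminance(1023, peak_nits=48.0))   # 48.0
```

Because the peak-white number is not standardized, two facilities can display the same master at different absolute levels, which is exactly the perceptual inconsistency Charles is describing.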
Call that an open question at the moment. That comment I just made inches back toward my earlier comment about setting SMPTE standards, because that should be a SMPTE standard. Honestly, there’s an extremely small number of people participating in SMPTE standards who have the experience in color science, and even color appearance modeling, that would inform them about how to choose that number. So get the youngsters in to help is what I say.
Tom: Sort of a related question: in calibrating the Sony OLEDs, don’t they actually use a different white point?
Charles: Well, that’s a very tricky piece of business. To encapsulate it in two sentences: basically, the display technology that we’ve been used to for the last 25 years, CRTs – or longer than 25, even, but let’s just say 25 – has reasonably wideband energy across the visual spectrum. It turns out the red is not wideband, though. If you looked at the spectral output of a red phosphor on a Sony CRT, you’d see that it’s two very narrow peaks.
So there are two peaks for the red and then a hump for the green and a hump for the blue. In the case of the OLED, there are three rather narrow peaks of energy – one for red, one for green, one for blue. The fact that they’re fairly narrow in spectral terms has two implications. One implication is that you can make much more colorful colors. You can call that wide gamut if you’d like. The emerging UHD TV standard – or what I would prefer to just call the UHD standard – also has quite a wide gamut. That’s one beneficial effect of the characteristics of those light sources in the Sony OLED display, or anyone else’s OLED display, but Sony is the best example today because they’re making a studio display.
Well, both a studio display and a so-called professional display, which really means one notch below studio but still extremely good. Fact number one: it’s potentially wide gamut. That’s a big deal. Fact number two about the OLED’s narrow spectral emission is that it potentially causes a problem where different observers see slightly different colors, including possibly different colors for white when you light up all three of red, green, and blue.
You can potentially get a little shift in the colors that people see for white. Sony has floated out what I would call a – well, I guess I can be a little bit blunt in calling it a short-term fix for that problem, which is just altering the white point slightly. They are mitigating the problem of observers seeing slightly different colors. Let me give you the color science name for that. It’s called metamerism.
Tom: Metamerism.
Charles: It’s M-E-T-A-M-E-R-I-S-M, which you might think would be METAmerism, but I can tip you off that it’s the secret handshake among color scientists to pronounce that meTAMerism.
Tom: I’m in the in group now.
Charles: You are now in the know. So when you hear someone pronounce it METAmerism, which would probably be your first guess, you can say to them, “Uh, actually, we in the know know that it’s pronounced meTAMerism.” That’s what they’re trying to address. That is an issue in LED and, by the way, also in laser displays. Now, I personally believe that we are not likely to see laser displays in consumer environments, but we will see laser displays in cinema. There is no doubt about it.
That issue of – and I’ll put a qualifier in front of that – observer metamerism, it’s likely to arise. I mention observer metamerism because there’s another flavor of metamerism called camera metamerism, which is where your camera doesn’t see colors the same way that we see them. That’s not an issue on the distribution or display end.
Tom: Well, yeah, it’s actually something that Luhr Jensen had mentioned a number of times in our conversations. Now I have the right way to say it. Before we move into display calibration related questions – I have a couple of those – what other concepts in color science related to what we’re talking about here stay the same as we move from SD and HD up to 4K and 8K?
Charles: This is a super good and interesting question. The key concept that pulls all of the standards together, even though there are different standards – the key conceptual element for digital color imaging – is additive RGB mixture. All of the current standards for electronic images, both for transmission and display, are based on that premise. The idea is that you add a certain amount of red, a certain amount of green, and a certain amount of blue to make all the colors that you can make.
You can’t ever make all of the colors. You can make most of them, and you can certainly make all of the important ones. That concept of additive mixture is then built into the standards for exchange and mastering and display. However, certain display technologies adhere to that principle directly in their physics, like plasma and DLP and OLED. Certain other display technologies, like LCD, do not. If you’re faced with trying to get a decent rendition out of an LCD display, you need to find some way to make it behave like additive RGB mixture, even though the physics of the display technology doesn’t quite conform to that.
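Additive RGB mixture can be sketched in a few lines: with linear-light (gamma-decoded) RGB, the displayed color is just a weighted sum of the three primaries, which for the Rec. 709 primaries and D65 white collapses to a single matrix multiply into CIE XYZ. The matrix below uses the commonly published four-decimal Rec. 709 coefficients; the helper name is illustrative, not from any standard or library.

```python
# Rec. 709 linear-light RGB to CIE XYZ (D65 white), commonly published
# coefficients rounded to 4 decimals.
RGB_TO_XYZ_709 = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def rgb_to_xyz(r, g, b):
    """Additive mixture: each XYZ component is a weighted sum of the
    linear-light R, G, and B contributions."""
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in RGB_TO_XYZ_709)

# Full red + full green + full blue adds up to the D65 white point
# (X ~ 0.9505, Y = 1.0, Z ~ 1.0890):
print(rgb_to_xyz(1.0, 1.0, 1.0))
```

The middle row of the matrix is the familiar set of Rec. 709 luminance weights (0.2126, 0.7152, 0.0722), which is why the Y component of full white comes out to exactly 1.0.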
In LCD displays, they’re sort of reasonably close, plus or minus, but they’re not close enough for studio use. That issue gets us into what you do to overcome it. Well, you calibrate. In the case of a display that’s got intrinsic non-additive mixture – and obviously there are huge numbers of LCDs out there; I’m thinking numbers in the billions, though it may only be 700 million – there are a lot of them, and to make those usable at the high end, you need to compensate. That gets us into the high-tech arena of display calibration and building 3D interpolated lookup tables to fix it.
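As a rough illustration of the 3D interpolated lookup table Charles describes, here is a minimal trilinear-interpolation sketch. The 17-point lattice size and function names are illustrative assumptions; a real calibration LUT would be populated from measurements of the actual display, not the identity mapping used here as a sanity check.

```python
import numpy as np

def apply_3d_lut(rgb, lut):
    """Look up one RGB triple (components in 0..1) in a cubic 3D LUT of
    shape (N, N, N, 3), blending the 8 surrounding lattice entries with
    trilinear interpolation."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional position inside the lattice cell
    out = np.zeros(3)
    for corner in range(8):  # visit the 8 corners of the enclosing cell
        weight = 1.0
        index = []
        for axis in range(3):
            if (corner >> axis) & 1:
                index.append(hi[axis])
                weight *= f[axis]
            else:
                index.append(lo[axis])
                weight *= 1.0 - f[axis]
        out += weight * lut[index[0], index[1], index[2]]
    return out

# Sanity check: an identity LUT (each entry holds its own coordinates)
# leaves colors unchanged.
n = 17
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)
print(apply_3d_lut([0.25, 0.5, 0.75], identity_lut))
```

In a calibration workflow, the identity entries would be replaced with measured corrections so that driving the display through the LUT makes it behave like an ideal additive-RGB device.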