Tuesday, February 9, 2016

[Avid-L2] Re: HDR workflow

 

I'll answer what I can from below. First, it is interesting that Vizio is calling their new HDR monitors "reference" monitors:
Reference Series Full-Array LED HDR + UHD Smart TV 2015 | VIZIO

 

Avid didn't have anyone to present at this event, so I really can't answer any MC workflow questions. Baselight uses a Dolby Vision LUT and they were using the plugin version out of MC for the demo. 

Not sure who is going to do HDR Blu-ray if and when it happens, but there is support for it built into the standard: Ultra HD Blu-ray arrives March 2016; here's everything we know

 


The UltraStudio 4K (and the Artist DNX I/O) supports HDMI 2.0, so Avid could handle it if they chose to.


I do think Dolby is going to be "The Standard" because they are first to the plate. However, it gets interesting. As I understand it, "Dolby Vision," if you license it, adds support for the metadata that adapts the picture to different displays. In other words, to get the Dolby certification you need to have them build a LUT for your monitors. Since Sony hasn't done that, their monitors do show the HDR; it just isn't calibrated according to Dolby. However, the underlying spec is a SMPTE spec, so Sony can do their own calibration.
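
(For anyone wondering what "build a LUT for your monitors" amounts to mechanically: the per-display mapping is typically shipped as a 3D LUT cube that the color system interpolates through. Below is a minimal, generic sketch of applying one with trilinear interpolation. This is my own Python/numpy illustration, not Dolby's or FilmLight's code, and the function name and cube layout are assumptions.)

    import numpy as np

    def apply_3d_lut(rgb, lut):
        """Map RGB pixels through a 3D LUT via trilinear interpolation.

        rgb: (..., 3) float array with values in [0, 1]
        lut: (n, n, n, 3) float array, indexed [r, g, b] (layout assumed)
        """
        n = lut.shape[0]
        idx = np.clip(rgb, 0.0, 1.0) * (n - 1)  # position in LUT index space
        lo = np.floor(idx).astype(int)
        hi = np.minimum(lo + 1, n - 1)
        f = idx - lo                            # fractional position in the cell

        r0, g0, b0 = lo[..., 0], lo[..., 1], lo[..., 2]
        r1, g1, b1 = hi[..., 0], hi[..., 1], hi[..., 2]
        fr, fg, fb = f[..., 0:1], f[..., 1:2], f[..., 2:3]

        # Blend the 8 lattice points around the pixel: red axis first,
        # then green, then blue.
        c00 = lut[r0, g0, b0] * (1 - fr) + lut[r1, g0, b0] * fr
        c10 = lut[r0, g1, b0] * (1 - fr) + lut[r1, g1, b0] * fr
        c01 = lut[r0, g0, b1] * (1 - fr) + lut[r1, g0, b1] * fr
        c11 = lut[r0, g1, b1] * (1 - fr) + lut[r1, g1, b1] * fr
        c0 = c00 * (1 - fg) + c10 * fg
        c1 = c01 * (1 - fg) + c11 * fg
        return c0 * (1 - fb) + c1 * fb

The takeaway is that the LUT bakes one specific panel's behavior into the pipeline, which is exactly why every monitor model needs its own.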

My estimate on the time frame? Hmmm, that's tough. If the entities involved would be honest and allow us to do 1080 HDR in our existing pipelines, then I would say the not-too-distant future. But they have chosen to tie HDR to 4K, and that is going to slow adoption. I strongly doubt any broadcasters are going to reinvest in all-new hardware to broadcast something (4K) that most folks can't see.

So for now HDR is only going to be over-the-top streaming and the aforementioned Blu-ray. Interesting note here: VUDU, who are currently streaming Dolby Vision content, said they will drop resolution before they drop HDR when bandwidth becomes constrained. This is because they see an ROI on HDR with their consumers, but not on 4K.

---In Avid-L2@yahoogroups.com, <blafarm@...> wrote:

I am sorry that I couldn't watch that event. I am keenly interested in these developments. If you don't mind, I have a couple of follow-up questions. I'll try to keep it simple, and if the answers are too complicated, I will understand:

 

Let's assume for a moment that you don't have access to either the Sony BVM OLED or the Dolby 4K monitor.

 

And let's also assume that you are "OK" with grading on a consumer Vizio Dolby Vision monitor (which is also nit-limited), or on some other manufacturer's future VS10-compliant HDR screen. VS10 supports single- and dual-layer Dolby Vision as well as Baseline HDR10 (which is the UHD Blu-ray and UHD Alliance specification).



 

1.  Is it currently possible to use the Media Composer Baselight plugin (not the full product) to grade HDR?

2.  Does Media Composer currently support single- or dual-layer Dolby Vision, or Baseline HDR10 metadata?

3.  How does the HDR metadata get exported along with the media -- and what file formats will we be using to export HDR programs?

4.  Are there currently any reasonably-priced software tools that allow HDR media to be mastered onto the new Ultra HD Blu-ray discs?

5.  Are there currently any Avid or 3rd Party Media Composer I/O interfaces that support HDMI 2.0a and the various HDR flavors?

6.  In addition to Vizio, Dolby inked licensing deals at CES with Royal Philips, Technicolor, LG Electronics, TCL and Funai.  Do you see the industry coalescing around the Dolby Vision standard -- or do you think companies like Sony and Samsung will continue to promote proprietary HDR formats?

7.  What is your personal opinion regarding when the existing chaos will settle into a predictable workflow: one year, two years?

 

Thanks and sorry for all of the questions.






---In Avid-L2@yahoogroups.com, <tcurren@...> wrote:

Hi Mike,

We just had an HDR event at Editors' Lounge. What I can tell you now is that there isn't a "workflow" for HDR yet. We are in the alpha stages with this stuff. There are multiple "standards." 

The only shipping reference monitor is Sony's BVM OLED, and it only hits 1,000 nits where Dolby wants 4,000. While Dolby has a 4,000-nit monitor, they aren't selling it; unless you are the size of a Warner Brothers or a Technicolor, you can't get your hands on one. Otherwise, you are going to be using a Vizio consumer TV to color correct.

FilmLight worked directly with Dolby to create their Dolby Vision workflow, so that is your current best option if you need to get one done and can get your hands on the Dolby monitor.

Dolby has their own box to do an "automatic" conversion from your HDR color pass to SDR (Standard Dynamic Range, i.e., what we have now), and it has only minimal adjustability for fixing things. I personally think we will be doing two color passes, as you are going to develop a different aesthetic for HDR correction. Why?...
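
(To make the "minimal adjustability" point concrete: an automatic conversion is essentially one fixed compression curve applied to luminance. Here is a toy Python sketch, a generic soft-knee rolloff of my own. It is emphatically not what the Dolby box does, and the knee and peak numbers are just assumptions.)

    import math

    def tone_map_nits(y_hdr, knee=70.0, peak_hdr=4000.0, peak_sdr=100.0):
        """Toy HDR-to-SDR luminance mapping: linear below the knee, then a
        log-domain squeeze above it. Purely illustrative, not Dolby's math."""
        if y_hdr <= knee:
            return y_hdr  # shadows and midtones pass through untouched
        # Compress knee..peak_hdr (on a log scale) into knee..peak_sdr.
        t = math.log(y_hdr / knee) / math.log(peak_hdr / knee)
        return knee * math.exp(t * math.log(peak_sdr / knee))

    # tone_map_nits(700.0)  -> ~85.8 nits  (a hot highlight on a face)
    # tone_map_nits(4000.0) -> 100.0 nits  (the specular glint)

A 700-nit highlight and a 4,000-nit glint end up about 14 nits apart in SDR; the curve has no idea which of the two carried the creative intent, and that is precisely what a colorist would fix by hand.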

Adding the much brighter white point really adds to the impact of the picture. Right now, if there is a glint off a car window on a bright day with a blown-out sky, you would probably put the sky at 100 IRE, and that is also where the now-clipped glint would live. In HDR, a white sky at maximum level (4,000 nits, or 40 times brighter than today's roughly 100-nit TVs) would give you a sunburn and force you to wear sunglasses to watch TV. So maybe the sky (which will probably not be blown out in HDR) sits at 80% and the glint on the car goes to 100%. So where do you put the face in that? And how does a box automatically adjust for that creative choice? Problematic at best.
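
(Some rough numbers behind that, for the curious: both Dolby Vision and HDR10 ride on the SMPTE ST 2084 "PQ" transfer function. The sketch below uses the constants published in the spec; the function name is mine.)

    def pq_encode(nits):
        """SMPTE ST 2084 inverse EOTF: absolute luminance in nits to a PQ
        signal level in [0, 1]. Constants are the published ST 2084 values."""
        m1 = 2610.0 / 16384.0
        m2 = 2523.0 / 4096.0 * 128.0
        c1 = 3424.0 / 4096.0
        c2 = 2413.0 / 4096.0 * 32.0
        c3 = 2392.0 / 4096.0 * 32.0
        y = (nits / 10000.0) ** m1  # normalize to PQ's 10,000-nit ceiling
        return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

    # pq_encode(100)  -> ~0.51  (today's reference white: about half the range)
    # pq_encode(1000) -> ~0.75  (roughly the Sony BVM OLED's peak)
    # pq_encode(4000) -> ~0.90  (Dolby's 4,000-nit target)

In other words, nearly half of the PQ signal range sits above today's 100-nit reference white. All of that is new highlight headroom, and deciding what lives up there (sky, glint, face) is the new aesthetic question.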

So as I said at the beginning of this post, we are in the wild, wild west of developing HDR workflows, standards, and aesthetics. That said, this is the first real improvement to our final product in my professional career that I am not only excited about, but truly see the value in.




 


Posted by: tcurren@aol.com
