Sunday, December 22, 2019

Re: [Avid-L2] Continuing Quest to output Dolby Vision over HDMI from Resolve 15.4 using Avid DNxIO?

I don't think a skilled colorist would tell you otherwise. At best, Dolby Vision seems to be a compromise so networks/services only have to deal with one file to fill all needs. No matter what Dolby or any other pundit tells you, a separate pass is better. That's the idea behind the 3D cheat in Resolve: make one eye HDR and the other SDR. Then while you are working you can make screeners from the SDR eye that will play nicer for most clients. It's all a big video cluster flop, and I think my next career path will be herding cats, as it seems much simpler.


On Sun, Dec 22, 2019 at 12:04 PM, Gowanus Canal wrote:
Unfortunately, that only applies to the HLG standard. One hundred nits and below are treated as SDR, and anything above is HDR. But the monitor must still understand HLG for this to work.
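For reference, the SDR compatibility of HLG comes from the shape of its OETF (defined in ITU-R BT.2100): the lower portion of the signal range is a simple square-root segment, which looks close enough to a conventional gamma curve that an SDR display renders it acceptably, while only the upper segment carries the HDR log extension. A quick sketch:

```python
import math

def hlg_oetf(e):
    """HLG OETF per ITU-R BT.2100: scene-linear light e in [0, 1]
    to a normalized non-linear signal value in [0, 1]."""
    a = 0.17883277
    b = 1 - 4 * a
    c = 0.5 - a * math.log(4 * a)
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root, SDR-compatible segment
    return a * math.log(12 * e - b) + c  # logarithmic HDR extension

# Everything up to code value 0.5 is the gamma-like segment an SDR
# display can show; the log segment above it carries the highlights.
print(round(hlg_oetf(1 / 12), 3))  # 0.5 -- top of the SDR-like region
print(round(hlg_oetf(1.0), 3))     # 1.0 -- HLG peak
```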
 
DV and HDR10 do not work like this. DV HDR is done in full-range video levels at a color depth of at least 10 bits. DV SDR is an optional tone-mapped stream derived from the DV HDR master. And IMHO it does not look as good as a dedicated SDR Rec 709 grade on its own timeline, but I'm sure a skilled colorist would tell me otherwise.
 
 

DQS
 

On Dec 22, 2019, at 11:26 AM, Pat Horridge <pat@horridge.org.uk> wrote:

Isn't the point of HDR that the majority of the signal range is treated exactly as SDR, so it looks the same in those ranges, and that the HDR is only the extension of those upper highlights?
This sounds more like the whole grade is lifted across the whole HDR range.

Pat Horridge

From: Avid-L2@groups.io <Avid-L2@groups.io> on behalf of John Moore via Groups.Io <bigfish=pacbell.net@groups.io>
Sent: Sunday, December 22, 2019 4:20:40 PM
To: Avid-L2@groups.io <Avid-L2@groups.io>
Subject: Re: [Avid-L2] Continuing Quest to output Dolby Vision over HDMI from Resolve 15.4 using Avid DNxIO?
 
Yes on all counts. My instructor called it "eCMU" in class, or I misheard, but from my reading I agree it's the internal software CMU if you have the correct hardware, which is listed as the BM UltraStudio 4K Extreme and the PCIe card version, the 4K Extreme. I'm thinking that my Avid-branded DNxIO does not qualify for Dolby metadata over HDMI. I found a thread to that effect from 2016, but maybe that has changed.

What I have set up at home is not using the iCMU, and there is no tunneling involved. The SDI HDR signal comes out of the left eye from Resolve, which corresponds to the SDI_A BNC spigot on the top and bottom rows of output BNCs. The right-eye output comes from the SDI_B spigot on the bottom row of BNCs. I suppose I could use the top-row SDI_B spigot, but I'm using that for the dual-link output when I'm not running in HD mode. Many types of this monitoring are limited to HD unless you get a bigger BM output card. In my classes it seems virtually everybody, including Cortex (which seems to favor AJA cards), uses HD for tunneling.

I got my concept of tunneling clarified in class last week. Tunneling is sending metadata down the HDMI feed that directly instructs the Dolby chip in the TV to interpret the signal correctly so it does the proper Content Mapping. This, I assume, emulates what the metadata in the final Dolby Vision master will do when a TV at home with a Dolby chip interprets the signal. So if a Dolby Vision master was created on a 1000-nit monitor (the Sony X300/310 models seem to be the required standard for most), the Dolby Vision metadata lets the TV chip know this, and the chip then factors in the abilities of the actual TV playing back the content and Content Maps accordingly. So a typical 600-nit home TV will scale, or I assume the more correct term is "Content Map", the signal to best display on the 600-nit screen.
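As a toy illustration only (Dolby's actual content-mapping algorithm is proprietary and driven by per-shot metadata, and the peak/knee values here are made-up parameters), the basic idea of fitting a 1000-nit master onto a 600-nit panel can be sketched as "leave the shadows and mids alone, compress the highlights into the headroom the panel actually has":

```python
def map_to_display(nits, master_peak=1000.0, display_peak=600.0, knee=400.0):
    """Toy tone map: pass shadows/mids through untouched, then linearly
    compress highlights above the knee into the display's remaining
    headroom. Illustrative only -- not Dolby's CM algorithm."""
    if nits <= knee:
        return nits
    frac = (nits - knee) / (master_peak - knee)
    return knee + frac * (display_peak - knee)

print(map_to_display(100.0))   # 100.0 -- diffuse white untouched
print(map_to_display(1000.0))  # 600.0 -- master peak lands at panel peak
```

The real system does something far more sophisticated (rolloff curves, per-scene trims), but this is the gist of why the same master can look right on panels of different peak brightness.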

The AJA Hi5-4K Plus unit can turn on the 2084 flag or an HLG flag, plus some other choices I don't understand yet. It also has some scaling or LUT-like options that I haven't explored.

I don't think Avid has this capability to separately handle left- and right-eye signals in 3D mode.

The other thing that makes this work is that in Resolve I have the ColourLab HDR plugin. This is the company my instructor runs. The plugin will take the SDR Rec 709 and properly remap it to 2084. This is not what happens when you switch Resolve into Resolve color management and set it to output 2084. Resolve's internal mapping takes 100 percent Rec 709 and maps it to 100 nits in 2084. While that may seem correct in theory, in practice it's then hard to stretch out the highlights into a more HDR signal.

The HDR plugin maps things a bit higher based on brightness, so it's not just adding overall gain; it treats things in a log fashion and maps accordingly. The result is you can color correct in Resolve with all the lift/gamma/gain controls hitting the typical tonal zones, and the plugin scales to HDR in an intelligent manner. Thus, even though everything in HDR lives well below 50 percent, you get to work with a more familiar 709-style signal, and lift, gamma, and gain affect the normal areas of the signal. Without the plugin, given that so much of the HDR signal lives down in what were traditionally the lift and lower gamma levels, the lift, gamma, and gain controls don't feel natural.
