Tuesday, March 19, 2019

Re: [Avid-L2] Avid AAF to Resolve Timeline, Video monitoring and image scaling hierarchy?


You are correct, it's Ver 15.2.4.

So I think I've found a way to work in light of your suggestions.  I didn't do a disk speed check because if I go to the edit tab I can play 23.976 perfectly even when the color tab won't.  I figure if drive throughput were the issue it would happen in the edit tab too, so it seems the color tab is just more taxing on the system.  This is a timeline with no color correction applied, just a source-side LUT, which is also present in the edit tab, so it's not as if the edit tab bypasses the LUT.
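
If I do get around to checking the drives without the BMD app, a quick-and-dirty timing script along these lines should be close enough (just a sketch; the mount point is hypothetical):

    import os, time

    PATH = "/Volumes/GRAID/speedtest.bin"   # hypothetical mount point
    SIZE = 5 * 1024**3                      # 5GB test file, per your suggestion
    CHUNK = 64 * 1024**2                    # write/read in 64MB chunks

    buf = os.urandom(CHUNK)
    t = time.time()
    with open(PATH, "wb") as f:
        for _ in range(SIZE // CHUNK):
            f.write(buf)
        f.flush(); os.fsync(f.fileno())     # force it to actually hit the disk
    print(f"write: {SIZE / (time.time() - t) / 1e6:.0f} MB/s")

    t = time.time()
    with open(PATH, "rb") as f:
        while f.read(CHUNK):
            pass                            # note: OS caching can flatter this number
    print(f"read:  {SIZE / (time.time() - t) / 1e6:.0f} MB/s")

    os.remove(PATH)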

So if I leave the timeline settings at 4096_2160, set video monitoring to UHD 2160P 23.976, and set image scaling to output resolution custom 3840_2160, I get solid 23.976 playback in the color tab.  If I check match timeline it will only play back at 20.5 fps max.  Also, if I set video monitoring to 4096_2160 it won't play back at 23.976 either.
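
Out of curiosity I ran the numbers on that last part (just arithmetic; the conclusion is my speculation):

    # DCI 4K monitoring pushes only ~7% more pixels than UHD, yet the drop
    # was 23.976 -> ~20.5 fps (~15%), so raw pixel count alone probably
    # isn't the whole story -- maybe the extra scaling step costs too.
    dci = 4096 * 2160
    uhd = 3840 * 2160
    print(f"pixel ratio: {dci / uhd:.3f}")     # ~1.067
    print(f"fps ratio:   {20.5 / 23.976:.3f}") # ~0.855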

One curiosity I found: when I set image scaling to 4096_2160 match timeline, the resulting downconvert on spigots 3 and 4 does not follow scale image to fit but stretches the image to fill the screen, losing the letterbox.  I'm pretty sure I've seen similar behavior running Avid through the DNxIO and feeding my LG OLED with the HDMI output.  If I set the Avid project format to UHD I get the proper letterbox with scale to fit, and I see it stretch when I set it to stretch.  It seems this behavior is the same in Resolve; Resolve just calls it stretch to corners.  I guess this makes sense, as I'm telling the systems the video hardware is 4096_2160 when it isn't, so it no longer follows the letterbox or stretch parameters.
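
For my own notes, the difference between the two behaviors in plain math (my sketch, not anybody's actual scaler code):

    def scale_to_fit(src_w, src_h, dst_w, dst_h):
        # one uniform factor, aspect preserved, letterbox fills the rest
        s = min(dst_w / src_w, dst_h / src_h)
        return round(src_w * s), round(src_h * s)

    def stretch_to_corners(src_w, src_h, dst_w, dst_h):
        # independent factor per axis: fills the raster, distorts the image
        return dst_w, dst_h

    print(scale_to_fit(4096, 2160, 1920, 1080))        # (1920, 1012) letterboxed
    print(stretch_to_corners(4096, 2160, 1920, 1080))  # (1920, 1080) stretched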

As far as setting the GPUs manually, I did that and it works and plays 23.976 in the color tab.  If I just leave it on auto and uncheck use display GPU for computations it still works.  Even if I check use display GPU for computations it still works, now that I've set video monitoring to UHD and image scaling to UHD.  Perhaps because I'm just playing back LUTted clips with no color correction, the VRAM is not being taxed.  According to the Resolve preferences the two TitanX GPUs have 12GB of VRAM each and the GTX-680 has 2GB.  So for now I'll leave it on auto and uncheck use display GPU for computations.  It will be interesting to see how performance holds up as I add color correction and noise reduction, which is taxing.  A while back I recall downloading a Resolve project designed to test system performance; I'll have to poke around and see where I put it.  I forget where I found it, but IIRC somebody posted a link to a small project with a timeline that would stress a system.
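
If I've understood the VRAM rule right (see your note below), it boils down to a min(), not a sum -- a toy illustration with my cards' numbers:

    gpus_gb = {"TitanX #1": 12, "TitanX #2": 12, "GTX-680": 2}

    def usable_vram(enabled):
        # Resolve is constrained by the smallest enabled card, not the total
        return min(gpus_gb[name] for name in enabled)

    print(usable_vram(["TitanX #1", "TitanX #2", "GTX-680"]))  # 2 GB
    print(usable_vram(["TitanX #1", "TitanX #2"]))             # 12 GB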

Thanks for the info.  Is there a configuration guide, or is it in the monster manual, that explains how GPU VRAM is limited to the smallest GPU in use?  That kind of info is invaluable in troubleshooting.


---In Avid-L2@yahoogroups.com, <bogdan_grigorescu@...> wrote :

-can you run bmd disk test on the graid and owc? 5GB test file size
-make sure the 680 is unchecked in preferences>memory and GPU>GPU configuration - the VRAM in resolve is not additive, but constrained to the lowest amount on any enabled GPU
-resolve loves CUDA and your cards support it, so I would force it manually
-resolve 15.4 does not exist yet - I hope you're not running the app store version, which is riddled with limitations
-unless delivering to 4K, I would try to set the timeline to 3840x2160 and set image scaling>image scaling>scale entire image to fit

BG
www.finale.tv

On Tuesday, March 19, 2019, 4:01:20 p.m. PDT, John Moore bigfish@... [Avid-L2] <Avid-L2@yahoogroups.com> wrote:

This is probably more of a Resolve question, but I'm AAF-ing from Avid, so perhaps someone on the L2 has battled this.

I have a Mac Pro mid-2012 cheese grater, 12-core 2.66GHz, Mac OS 10.12.6, MC 2018.12.1, Resolve 15.4 Studio.  I have a GTX-680 in the tower and a Cyclone Microsystems expansion chassis with 2 TitanX GPUs, not flashed for Mac, but the 680 handles the GUI.

I'm working with Panasonic AVC-Intra 4096_2160 YUV 10bit Rec 709 media attached via USB 3 from a G-RAID and/or an OWC Dual Elite.

I set the timeline setting in Resolve to 4096_2160.  I have an Avid-branded DNxIO running the latest Avid Desktop software for it.  My ultimate goal is to mimic how I use the DNxIO in Avid, where I feed an LG OLED 55 inch with HDMI and then take advantage of setting the DNxIO to Single or Dual Link so spigots 3 and 4 provide an HD downconvert.  Then the LG can help in pixel busting and I can use my HD Tek scope and Sony PVM-A250 for color.

In my limited understanding, and please correct me if I'm wrong, I want to set the Resolve timeline setting to 4096_2160 to maintain all the pixel goodness of my original media.  Then, because I want to feed UHD to my LG monitor, I set video monitoring to 3840_2160 and set image scaling to HD.  This works, but I don't have my LG at home to test the HDMI output.  The problem is playback performance: the Resolve color tab won't play back at 23.976.  This is with no color correction done at all, just a source-side LUT, var35 to Rec 709.  If I switch to the edit tab I can get 23.976 playback.

If I instead leave the timeline at 4096_2160 and set video monitoring to HD, I then get a postage stamp out of spigots 3 and 4.  I can fix the postage stamp by setting image scaling to match timeline, and in this configuration I get full playback speed.  But given I have set video monitoring to HD, I assume the HDMI out of the DNxIO will now be HD and not UHD.

Can anyone shed some light on the hierarchy of timeline settings, video monitoring, and image scaling?  I think perhaps the nature of the DNxIO's internal downconvert to spigots 3 and 4 when in dual or single link mode may contribute to the postage stamp behavior.  I have found no parameters in the Video Desktop software to address this internal downconvert.  I only learned about it through a response to one of my old threads on how to use it with Avid Symphony.
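
For what it's worth, here is the mental model I'm currently working from -- purely my assumption about the ordering, not documented Resolve behavior:

    # Assumed chain: timeline resolution is the grading canvas, image
    # scaling maps that canvas onto the monitoring raster, and video
    # monitoring is what the DNxIO is told to put on its spigots.
    timeline_res   = (4096, 2160)   # working canvas
    monitoring_res = (1920, 1080)   # raster handed to the DNxIO

    def monitored_frame(scaling_mode):
        if scaling_mode == "match timeline":
            return monitoring_res   # canvas resized to fill the raster
        # with a mismatched fixed output size the image seems to land
        # small inside the raster -- the postage stamp I'm seeing
        return "postage stamp"

    print(monitored_frame("match timeline"))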

I'm a little surprised at the lower performance in the UHD monitoring mode given I have two TitanX GPUs and a GTX-680.  I experimented with forcing the GPUs to CUDA, OpenCL, and manually assigning them.  It seemed just leaving the setting on Auto provided as good performance as any other permutation.  I would think with all that GPU, setting video monitoring to UHD shouldn't keep it from playing at 23.976.  If the performance issue were based on my source drive throughput, I would think it would happen no matter what settings I choose, but given it only happens with UHD monitoring I have to feel it's a processing issue.  I realize that as I add color grade nodes and noise reduction, performance will probably suffer until things cache or whatever happens.
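
Rough numbers behind that hunch (the data-rate assumptions are mine; I haven't measured the actual files):

    # Even fully uncompressed 4096x2160 10-bit 4:2:2 at 23.976 is only
    # ~530 MB/s, and AVC-Intra sits far below that, so a USB 3 RAID that
    # plays fine in the edit tab shouldn't suddenly choke in color.
    width, height = 4096, 2160
    bits_per_pixel = 20              # 10-bit 4:2:2 averages 20 bits/pixel
    fps = 24000 / 1001               # 23.976
    uncompressed = width * height * bits_per_pixel / 8 * fps / 1e6
    print(f"uncompressed ceiling: {uncompressed:.0f} MB/s")  # ~530 MB/s
    avc_intra = 600 / 8              # assuming a ~600 Mb/s class codec
    print(f"AVC-Intra-ish:        {avc_intra:.0f} MB/s")     # ~75 MB/s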

Any suggestions welcome.  I'm in Resolve classes again for the next two weeks, but we don't always get into the nitty-gritty hardware stuff.

John Moore
Barking Trout Productions
Studio City, CA
bigfish@...
