I agree that I see no reason to shoot True 24.0 in my world. I didn't say the rates exist because of audio, but the pull-down and pull-up are based on the recorded sample rate. I'm just not sure which way they offset it in the field. It's all an unnecessary pain to me, but I've read that people would record at 47.952 kHz or 48.048 kHz in the field to plan for an audio pull-up or pull-down. I don't know if they recorded at those sample rates and then re-timestamped the resulting files, or if the files get adjusted during ingest because of their off-48K timestamp. I was hoping someone who deals with those types of workflows might shed some light on it.
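For what it's worth, those two field rates are just 48 kHz offset by the nominal 0.1% pull-up and pull-down factors. A quick sketch of the arithmetic only (Python; not a claim about any particular recorder's behavior):

```python
from fractions import Fraction

base = 48000  # Hz, the nominal audio sample rate

# Nominal 0.1% pull-up and pull-down, as seen on field recorders
pull_up = base * Fraction(1001, 1000)    # 48048 Hz
pull_down = base * Fraction(999, 1000)   # 47952 Hz

print(int(pull_up), int(pull_down))      # 48048 47952
```

Note that 1001/1000 and 999/1000 are not exact reciprocals (their product is 0.999999), which is part of why "0.1%" gets used loosely to describe both directions.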
This sort of gets back to the original question of the thread: is there any reason to shoot True 24 instead of 23.976? If I were doing a show that is going to air in the US and overseas with 25fps deliverables, would True 24 have any advantage? I can't think of one, but I don't do much with standards conversions in the file-based world. Those things usually happen at our dubbing facilities.
---In Avid-L2@yahoogroups.com, <cutandcover@...> wrote :
On Wed, Jan 17, 2018 at 4:39 PM, bigfish@... [Avid-L2] <Avid-L2@yahoogroups.com> wrote:

I don't have an answer to your question, but I've often wondered why people shoot True 24.0 vs 23.976. I live in the US, so 23.976 is more compatible with my broadcast workflows, but film workflows are traditionally True 24. When telecine was involved, things going to tape (or now file) usually got converted to 23.976, in my limited experience with those types of workflows.
I have always been told the whole thing with all the pull-ups and pull-downs is due to audio. A.K.A. "Low Frequency Video ;-)" The idea of recording at 48.048 kHz or 47.952 kHz, IIRC, was to manipulate the audio to stay in sync. I forget if those off-speed sample rates were then overridden to stamp the files as 48K, or if it meant that a later reinterpretation/conversion to 48K would apply the necessary speed change to maintain real-time sync with the video.
I used to think it was just overcranking or undercranking the recorded sample rate so that when the audio file played back at 48K it would be faster or slower as needed, but I think it actually works the opposite way. It seems more like when I use QuickTime Pro 7 to convert a 29.97fps clip to 23.976: the actual running time is maintained, but various frames are blended or skipped. Michael Phillips had told me that when importing audio files into Avid, something under the hood in QuickTime did something similar to the speed of the audio clip.
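The 29.97-to-23.976 conversion mentioned above works out to an exact 5:4 ratio, which is why a skip-style conversion can hold the running time while dropping every fifth source frame. A rough sketch of that idea (assuming a simple floor mapping for illustration; I'm not claiming this is QuickTime's actual algorithm):

```python
from fractions import Fraction

src = Fraction(30000, 1001)   # 29.97 fps
dst = Fraction(24000, 1001)   # 23.976 fps
ratio = src / dst
print(ratio)                  # 5/4: five source frames per four output frames

# Skip-style mapping: output frame i reads source frame floor(i * 5/4),
# so one source frame in every five is dropped and running time holds.
mapping = [int(i * ratio) for i in range(8)]
print(mapping)                # [0, 1, 2, 3, 5, 6, 7, 8] -- frame 4 skipped
```

A blend-style converter would mix adjacent source frames instead of skipping, but the timing math is the same.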
I think I was thinking of it the way I think of the console command "Ignore QT Rate," except instead of frame for frame it would come in sample for sample. But I'm pretty sure it doesn't do that.
Perhaps someone could clarify: if I shot true 24.0fps video knowing it would ultimately be 23.976, what sample rate should I record at on the DAW? My way of thinking would be to record at 48.048 kHz so that when the file was played back at 48K it would get the necessary slow-down (pull-down), but in asking this question over the years it seems to behave exactly the opposite of how I think. In a way it's just like the new Parker Puppy, although I don't have to scoop up piles of audio every day. ;-0
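For what it's worth, the raw arithmetic does line up with that intuition: material recorded at 48.048 kHz but stamped/played as 48K runs at exactly the same 1000/1001 speed as true 24.0 video slowed to 23.976 (24000/1001). A sketch of just the ratios (not a claim about how any specific DAW or recorder stamps its files):

```python
from fractions import Fraction

# Video pull-down: true 24.0 slowed to 23.976 (24000/1001 fps)
video_speed = Fraction(24000, 1001) / 24   # 1000/1001 of real time

# Audio recorded at 48.048 kHz but played back at 48 kHz:
# N samples now span N/48000 seconds instead of N/48048,
# i.e. the audio also runs at 1000/1001 speed
audio_speed = Fraction(48000, 48048)       # 1000/1001

print(video_speed == audio_speed)          # True
```

Whether a given workflow achieves that by re-stamping at the recorder or by a pull-down at ingest is, of course, exactly the open question in this thread.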
---In Avid-L2@yahoogroups.com, <hoplist@...> wrote :

On Jan 17, 2018, at 2:39 AM, Pat Horridge pat@... [Avid-L2] <Avid-L2@yahoogroups.com> wrote:

> all valid frame rates

And 30 and 60 are indeed valid ATSC frame rates, as distinct from 29.97 and 59.94.

I've spent a lot of years assuming that when people say 30fps, they mean 29.97. In many environments they are used interchangeably. Until now, I've never had to question that assumption. I guess I need to.

Are true 30 and 60 actually used, deliberately and distinctly? When and why?

Is there any chance that the industry standard will ever switch from 29.97 to 30? Or to 60fps rather than 59.94?

Cheers,
tod
Posted by: bigfish@pacbell.net