So back in the day, when we graded on Ikegami CRT monitors that nobody had at home, was that too much? I guess the point is: if the Dolby display technology isn't representative of the home viewer's TV, is it pointless? I think in general, if a grading monitor is accurately representing the image, then there is a point. If other monitors have inferior image quality, that's on them. Grading on a CRT was still considered valid when LCDs came along. As long as I can be consistent in my grade, and accurate with a scope and a monitor that reliably displays the content, I think advanced display technology is an asset. Of course, if the cost is way out of the ballpark, then from a business standpoint it may not make sense. I'm happy with my Sony OLED and its price point, but if someone wanted to buy me a Dolby monitor, I'd give it a whirl.
What do others think about grading on a monitor that is more typical of what the consumer will use? This also touches on whether it's still important to check video on a CRT, to make sure newer monitors aren't masking things like interlace issues. I'd appreciate hearing other people's views on this.