You need to make sure that the highlights do not burn out (cf. this old post here). In fact, the exposure values chosen in your examples take care of that. You want to keep the safety margin as small as possible, because every stop you are underexposing leads to missing/noisy data in the shadow regions. Lowering the exposure duration does lead to a little bit of quantization noise in dark image areas; however, as the noise caused by film grain is much stronger, nobody will notice.

The development of the raw image is a two-stage process. First, you transform the raw color values into "real" color values with camera-specific color matrices. A .dng file features at least two color matrices for two different color temperatures. Once you pick a white balance (and a corresponding color temperature), the actual color matrix used during development is interpolated from the two color matrices embedded in the .dng file. I did not follow the latest steps of the libcamera development too closely, but I think the two color matrices embedded in an HQ camera raw are taken from data/experiments of Jack Hogan. These should be fine for all practical purposes.

The second stage in the development of the raw image is the intensity mapping - both of you employ a noticeably S-shaped curve by setting the highlight and shadow values appropriately. In the end, both shadow and highlight areas feature reduced contrast compared to the mid-tone range, and that ensures that you still see image features in these areas without highlights burning out or shadows drowning in the dark.

The Mertens path (exposure fusion) is quite different. The JPEGs used in the exposure fusion are created by libcamera based on the tuning file used for this camera. Typically, the tuning file contains more than the two color matrices stored in the .dng. Normally, libcamera itself estimates a color temperature and calculates from that estimate the appropriate color matrix. If you work with preset color gains, a fixed color matrix is chosen.
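To make the two-stage development a bit more concrete, here is a minimal sketch of the idea - not libcamera's or Adobe's actual code. The matrices, temperatures, and the smoothstep-based S-curve are all hypothetical placeholders; interpolation is done in mired (1e6/K) space, which is the usual convention for blending matrices across color temperatures:

```python
# Sketch of the two-stage raw development described above.
# CCM_A/CCM_B and their temperatures are made-up illustrative values,
# NOT the matrices from any real DNG or tuning file.
import numpy as np

CCM_A = np.array([[ 1.50, -0.30, -0.20],
                  [-0.25,  1.40, -0.15],
                  [-0.10, -0.35,  1.45]])   # hypothetical matrix for 2850 K
CCM_B = np.array([[ 1.80, -0.55, -0.25],
                  [-0.30,  1.55, -0.25],
                  [-0.05, -0.50,  1.55]])   # hypothetical matrix for 6500 K
TEMP_A, TEMP_B = 2850.0, 6500.0

def interpolate_ccm(temp_k):
    """Blend the two embedded matrices for the chosen color temperature.
    Interpolation runs in mired (1e6 / K) space, where color shifts are
    roughly linear."""
    mired = 1e6 / temp_k
    m_a, m_b = 1e6 / TEMP_A, 1e6 / TEMP_B
    t = np.clip((mired - m_a) / (m_b - m_a), 0.0, 1.0)
    return (1.0 - t) * CCM_A + t * CCM_B

def s_curve(x, shadow=0.05, highlight=0.95):
    """Stage two: an S-shaped intensity mapping (here a plain smoothstep).
    Values near `shadow` and `highlight` get compressed, mid-tones
    expanded -- reduced contrast at the extremes, as described above."""
    t = np.clip((x - shadow) / (highlight - shadow), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def develop(raw_rgb, temp_k):
    """Stage one: camera RGB -> 'real' RGB via the interpolated matrix;
    stage two: tone mapping with the S-curve."""
    ccm = interpolate_ccm(temp_k)
    linear = np.clip(raw_rgb @ ccm.T, 0.0, 1.0)
    return s_curve(linear)
```

Picking a white balance at either endpoint temperature simply selects the corresponding embedded matrix; anything in between yields a convex combination of the two.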
Regarding RAW processing mastery: in my personal experience, I have found Lightroom to produce somewhat better results than other RAW processors. In particular, I have at some point in the past processed some RAW files of ordinary digital still photos in Affinity Photo. In addition, I actually tried at some point to base the post-processing for my film scans on the rawpy Python package. In both cases, I was never as satisfied with the results as I was with what I could get from Lightroom. Now, this could very well be the result of my having years of experience using Lightroom and very little experience using anything else. However, my personal feeling was always that Adobe has a huge amount of experience and expertise in their software, and that as a result Lightroom might just be more refined than other options. Maybe someone else has perceived something similar at some point? That said, if you are willing, maybe you can send me the .dng file of the example frame you showed in your post above, and we could have a look at what it looks like with my Lightroom preset applied, if it makes any difference at all.

Guys, your experiments are very interesting! As noted above, a raw capture needs a finely-tuned exposure.
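The "small safety margin" idea from the exposure discussion can be checked numerically: how far, in stops, do the brightest raw values sit below the sensor's clipping point? The helper below is a purely illustrative sketch (not part of rawpy or libcamera), assuming 12-bit raw data with a hypothetical white level of 4095:

```python
# Illustrative exposure-headroom check for a raw capture.
# `white_level` and the percentile are assumptions for a 12-bit sensor,
# not values taken from any particular camera.
import numpy as np

def headroom_stops(raw, white_level=4095.0, percentile=99.9):
    """Distance, in stops, between the given percentile of the raw data
    and the white level.  Near 0 means the highlights are about to burn
    out; each extra stop of headroom costs precision in the shadows."""
    bright = float(np.percentile(raw, percentile))
    return float(np.log2(white_level / max(bright, 1.0)))
```

For example, a frame whose brightest pixels reach half of the white level is carrying about one stop of headroom; pushing the exposure up until `headroom_stops` approaches zero keeps the shadow data as clean as possible without clipping.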