The GetDPI Photography Forum


M-M Higher resolution, Really? Thinking this through a bit...

mjm6

Member
I have an M9, so I have no reason to be pouty about the new M-M at all, and I love the idea, except that I am having trouble with a few details regarding whether this camera makes sense to most people (which it clearly does not).

The biggest one is the claim of higher resolution. We all know that the 100% higher resolution that Leica is claiming is nonsense (why do they do that, I wonder?). I could see 15% better, some people have said 30% better, but it's going to be in that range, not 100%.

I get that the Bayer filter interpolation in the raw file will reduce resolution, and that each pixel can be taken at its own site without requiring any manipulation.

However, does that actually happen? Can Leica actually bypass the interpolation in the software? If the camera outputs DNG files and the image is then rendered in a regular program like Lightroom, then doesn't the software apply a Bayer interpolation to the file regardless?

If the files are TIFF or something similar, I could see the claim for improved resolution, but unless there is a way to skip the interpolation in the software, how is it possible to produce a higher resolution image? Since every RAW file requires interpolation for color images, I can't imagine that the rendering engine interpolation can be turned off.

Does anyone have any insight into this?


---Michael
 

jonoslack

Active member
Hi Michael
I'm not really a techie, but I'll try and answer

I have an M9, so I have no reason to be pouty about the new M-M at all, and I love the idea, except that I am having trouble with a few details regarding whether this camera makes sense to most people (which it clearly does not).

The biggest one is the claim of higher resolution. We all know that the 100% higher resolution that Leica is claiming is nonsense (why do they do that, I wonder?). I could see 15% better, some people have said 30% better, but it's going to be in that range, not 100%.
I haven't seen Leica make the 100% claim - in the old days of the Kodak 760m there were claims of a 400% improvement.

Personally I think it depends from scene to scene and ISO to ISO, but it's certainly there . . . although whether it's relevant is quite another matter.

Putting a percentage on it seems rash!


I get that the Bayer filter interpolation in the raw file will reduce resolution, and that each pixel can be taken at its own site without requiring any manipulation.

However, does that actually happen? Can Leica actually bypass the interpolation in the software? If the camera outputs DNG files and the image is then rendered in a regular program like Lightroom, then doesn't the software apply a Bayer interpolation to the file regardless?
No, it doesn't apply Bayer interpolation - there is a new DNG format for monochrome files, and there is no Bayer interpolation. This is why the DNG files can't yet be read in most other software packages.
If the files are TIFF or something similar, I could see the claim for improved resolution, but unless there is a way to skip the interpolation in the software, how is it possible to produce a higher resolution image? Since every RAW file requires interpolation for color images, I can't imagine that the rendering engine interpolation can be turned off.

Does anyone have any insight into this?
Absolutely - it's turned off - no question.

I hope this helps
all the best
 

Vivek

Guest
Michael, it is the M9M. It still outputs 18 MP resolution in RGB.

Tonality should be better (and it looks quite clearly so).

100% better resolution?! I have no clue how they arrived at that number either. :confused:
 

Vivek

Guest
No it doesn't apply Bayer interpolation - there is a new DNG format for monochrome files, and there is no Bayer interpolation.


Absolutely - it's turned off - no question.
Jono, how do you get RGB output and not a greyscale one?
 

etrigan63

Active member
It's not that there are more pixels, folks. It's that in a color image (M9, M9-P) 25% of the pixels are red, 25% are blue, and 50% are green. So there are physically only 4.5 Mpx available. Software interpolation fills in the gaps and brings the count back up to 18 Mpx.

The M9-M has no Bayer filter. A pixel is a pixel. No color channels. Only brightness. This is why there is no headroom on the bright end and plenty of legroom in the dark end.
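That channel arithmetic can be checked in a few lines of Python (a toy illustration of the Bayer split, not anything from Leica's actual pipeline):

```python
# Channel counts in an 18 MP Bayer (RGGB) mosaic: each photosite
# measures exactly one colour, so the per-channel counts are
# fractions of the total rather than extra pixels.
total_px = 18_000_000

red_px = total_px // 4    # 25% of sites sit under a red filter
blue_px = total_px // 4   # 25% under a blue filter
green_px = total_px // 2  # 50% under a green filter

# Demosaicing then interpolates the two missing colours at every
# site, which is how the output gets back to 18 MP in all channels.
print(red_px, green_px, blue_px)  # 4500000 9000000 4500000
```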
 

mjm6

Member
Jono,

OK, so that's the key, a new uninterpolated DNG file. Makes sense...

The 100% claim is on the Leica website:

Incomparably sharp

With a full native resolution of 18 megapixels, the Leica M Monochrom delivers 100% sharper images than with colour sensors. As its sensor does not 'see' colours, every pixel records true luminance values - as a result, it delivers a 'true' black-and-white image. The combination of the brilliant imaging qualities of Leica M-Lenses and the image sensor results in images with outstanding sharpness and natural brilliance.
I read 100% sharper to mean it can resolve detail half the size (twice the lp/mm) at a given contrast compared to the color camera. The 400% you mention is probably people using the 'area method' of resolution definition rather than the 'linear method' (incorrect, but commonly applied by many people).
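As a quick sanity check on the two conventions (my own arithmetic, not a figure from Leica or Kodak):

```python
# A "100% sharper" claim by the linear method (lp/mm) corresponds
# to a much larger figure by the area method, because resolvable
# points per unit area scale with the square of linear resolution.
linear_gain = 2.0             # 100% better, linear convention
area_gain = linear_gain ** 2  # points per area scale as the square

print(area_gain)  # 4.0 -- quoted as "400%" by the area convention
```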


I am mostly a B&W shooter, and I still have difficulty seeing how this camera makes sense (timing, etc.) unless, and this is what I most fear, there will be no traditional RF-style M10, and Leica sees the 18 MP camera as the end of the line for this camera chassis.

Otherwise, why not wait until the next camera and sensor to introduce this variant, as the camera is probably close to the end of its product cycle now (note the Hermès versions, etc.)?
 

Vivek

Guest
I am positive that it is an interpolated file. Jono is wrong, I am afraid.
 

Brian S

New member
Take a picture of a blue-striped or red-striped object with the M9 and with the M9M. Let me know how it works out.

There is no reason to interpolate the output of a monochrome sensor.

I can put up my FORTRAN program for the KAF-1600 used in the DCS200ir. No interpolation. I did not put my spline interpolation routine into it.

If the Leica DNG converter performs interpolation, it would be easy enough to do a raw processor without it.

I can believe that a non-uniformity correction is done on the output, but this would correct intensity for individual pixels and does not perform spatial processing. The DCS460 had non-uniformity correction done outside of the camera. I have not written code for digital cameras in almost 15 years. Of course, it has been almost that long since monochrome digital cameras were on the market in any number. I read that there were only two DCS760m's made. By comparison, Kodak told me that they were going to make 50 DCS200ir's when I bought mine.
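The distinction between non-uniformity correction and spatial processing can be sketched like this (a generic per-pixel gain/offset model assumed for illustration; the actual Kodak/Leica calibration isn't public):

```python
# Flat-field (non-uniformity) correction: each pixel is corrected
# using only its own calibration data (a dark-frame offset and a
# flat-field gain). No neighbouring pixels are consulted, so unlike
# Bayer demosaicing this involves no spatial interpolation.
def correct_pixel(raw, dark, gain):
    """Subtract this pixel's dark offset, then divide by its gain."""
    return (raw - dark) / gain

# One pixel, corrected entirely from its own calibration values:
print(correct_pixel(raw=1200, dark=200, gain=0.8))  # 1250.0
```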
 

jonoslack

Active member
I am positive that it is an interpolated file. Jono is wrong, I am afraid.
Hi Vivek
It's not what I understood - especially with reference to the modified DNG standard - but I certainly understand that you understand these things better than I do! Brian, however, does understand them - so I'll leave you two to fight it out . . . and go to bed knowing that, for whatever reason, the MM files are different and do offer more resolution.

However, personally, I'm not even sure what 100% increase in resolution means!
 

mjm6

Member
It's not that there are more pixels, folks. It's that in a color image (M9, M9-P) 25% of the pixels are red, 25% are blue, and 50% are green. So there are physically only 4.5 Mpx available. Software interpolation fills in the gaps and brings the count back up to 18 Mpx.
This isn't really correct either...

Each photosite has a pixel, but the actual RGB information for each photosite is a combination of that pixel's own reading and a mathematical prediction of the values for that pixel based on the sites surrounding it.

This is partly why I don't think the 100% claim is correct: the information is never a 100% interpolation for any photosite (there are no site gaps that the software fills in, only color gaps in every site), but every site is partly an interpolation.
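To make that concrete, here is a toy averaging interpolation for a single green-filtered site (hypothetical numbers; real demosaicing algorithms are considerably more sophisticated):

```python
# At a green site in an RGGB mosaic, only G is actually measured;
# R and B are estimated from the nearest red/blue neighbours. So the
# RGB triple at *every* site is part measurement, part interpolation:
# there are no gap sites, only gap channels.
measured_green = 140           # this site's own reading
neighbour_reds = [120, 124]    # nearby red-filtered sites
neighbour_blues = [60, 64]     # nearby blue-filtered sites

r = sum(neighbour_reds) / len(neighbour_reds)    # interpolated
b = sum(neighbour_blues) / len(neighbour_blues)  # interpolated
rgb = (r, measured_green, b)
print(rgb)  # (122.0, 140, 62.0)
```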

---Michael
 

Vivek

Guest
There is no reason to interpolate the output of a monochrome sensor.
There is a reason. In fact, two. I did post this earlier in Jono's thread.

1. Price. If it had been a true monochrome camera, it would have been at least 3X the price.

2. Use of the Nik plugin and digital filters after the capture.
 

Vivek

Guest
Hi Vivek
It's not what I understood - especially with reference to the modified DNG standard - but I certainly understand that you understand these things better than I do! Brian, however, does understand them - so I'll leave you two to fight it out . . . and go to bed knowing that, for whatever reason, the MM files are different and do offer more resolution.

However, personally, I'm not even sure what 100% increase in resolution means!
Hi Jono, see my post above (I already posted these earlier - nothing new).

I could catch this immediately because of my own ongoing projects.

Leica should put out a vimeo on this, I think. :)
 

mjm6

Member
There is a reason. In fact, two. I did post this earlier in Jono's thread.

1. Price. If it had been a true monochrome camera, it would have been at least 3X the price.

2. Use of the Nik plugin and digital filters after the capture.
Vivek,

I think you may not understand how Bayer filters work (and please don't think I am talking down to you at all), or possibly I am not understanding what you said, but the Bayer filter on a camera is there to enable a sensor to see in color. Otherwise, a sensor is inherently a B&W device that happens to have a particular sensitivity curve (and there is a lot of discussion about this as well on the various fora).

One way to make a camera see in color is to have three (or four, if you want to do RGBK) photosites with filtration for each actual pixel in the resultant image file. The problem with this is that your final file resolution is severely hampered by this approach.

Another way to do it is to use the Bayer approach (a brilliant method, really), whereby each photosite only collects one RGB value, and then the neighboring sites' color values are used to provide the rest of the information through a formula. This approach does not need 3x or 4x sites.

Old cathode-ray TVs used essentially the first method, as an example.

Once the file is in the computer, the software triples the information into an RGB color file so that the NIK software can use it and apply toning and other effects to the image.

I do this all the time with B&W TIFF scans from B&W film when using the NIK software.
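That greyscale-to-RGB step is just channel duplication, which can be shown in plain Python (a minimal sketch, not how any particular raw converter is implemented):

```python
# Expanding a monochrome image to RGB for colour-only plugins:
# each pixel's single luminance value is copied into all three
# channels. No neighbouring pixels are involved, so this is
# duplication, not Bayer-style spatial interpolation.
gray = [[10, 20],
        [30, 40]]  # tiny 2x2 luminance image

rgb = [[(v, v, v) for v in row] for row in gray]
print(rgb[0][0])  # (10, 10, 10)
```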


---Michael
 

Vivek

Guest
Michael, I do understand how these work. :)

I posted MaxMax's site earlier. They do debayering of (2 models of) Canon DSLRs.

I am working on my own (Sony sensors) for my own use, since there is no commercial outfit that would do that for me.

Another way to do it is to use the Bayer approach (a brilliant method, really), whereby each photosite only collects one RGB value, and then the neighboring sites' color values are used to provide the rest of the information through a formula. This approach does not need 3x or 4x sites.
I am afraid you are wrong with this understanding. There is a physical (color) filter in the Bayer array.
 

mjm6

Member
OK, then I am not understanding why you feel the camera would cost 3x the price...

You leave the Bayer filter off, change the file processing a bit, and you suddenly have an image file that is the bit depth of the sensor (12 bit for the Kodak sensor IIRC), and the resolution of the sensor array.

---Michael
 

Brian S

New member
Vivek is wrong. That's all there is to it. The CCD used in the M9M is a monochrome detector, with no Bayer pattern mosaic filter over it. I had a DCS200c and a DCS200ir. I processed the output of the monochrome-IR detector with my own software, and there was no need for interpolation. With the DCS200c, interpolation for the Bayer pattern mosaic filter was performed in the Kodak-supplied drivers.

I had Kodak make the IR version of the DCS200 20 years ago, and talked with their engineers then. It cost an extra $4K over the standard $8,400 of the DCS200. Kodak did a run of 50 IR detectors. About a year ago, I called them up to ask what it would take to do a monochrome version of the KAF-18500 as a replacement for the one in the M9. They told me a run of about 50 detectors would make it worthwhile. I do not see where the 3x cost comes in. It did not happen 20 years ago, and things were expensive then.

Making a monochrome detector is just like making a bacon and tomato sandwich: leave out the lettuce - and the color dye in the mosaic layer. But making detectors was 30 years ago for me. We used detectors with different spectral responses for multiple-color sensors, with no mosaic pattern. But with mid-wave and long-wave infrared, it was much easier. The optics were expensive.
 

docmoore

Subscriber and Workshop Member
Now I am confused...

You state Leica says SHARP and you say RESOLVE....two different things...

Bob
 

Vivek

Guest
OK, then I am not understanding why you feel the camera would cost 3x the price...

You leave the Bayer filter off, change the file processing a bit, and you suddenly have an image file that is the bit depth of the sensor (12 bit for the Kodak sensor IIRC), and the resolution of the sensor array.

---Michael
QC issues. The Achromatic+ digital back is expensive because they have to pick a "flawless" sensor (1 in 10 or so?). Pixel mapping and such cannot be done there to mask the flaws.

Vivek is wrong. That's all there is to it.

Yes, it is a monochrome sensor with no Bayer dyes, but there is interpolation going on to get an RGB output. I will even categorically say that it is an M9 sensor without the Bayer dyes.

I am not going to post any more on this. But eventually the truth will come out. :)
 

mjm6

Member
Bob,

Do you understand how optical systems are defined for performance testing? 'Sharpness' isn't really defined, to my understanding, but 'resolution' is a definable term.

Sharpness in a print is often considered a combination of resolution and acutance, and is more about perception than performance. Hence the purpose of unsharp masks, for example.
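The acutance point can be shown with a toy one-dimensional unsharp mask (my own sketch, not taken from the linked article): it steepens edges without adding any real resolution.

```python
# Unsharp mask in 1-D: add back the difference between the signal
# and a blurred copy. Edge contrast (acutance) rises -- note the
# overshoot/undershoot halo -- but no new detail is resolved.
def unsharp(signal, amount=1.0):
    n = len(signal)
    blurred = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0, 0, 0, 90, 90, 90]  # a step edge
print(unsharp(edge))  # [0.0, 0.0, -30.0, 120.0, 90.0, 90.0]
```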

At least, that's how I've always interpreted it when people use the term sharpness.

Look here:
Understanding resolution and MTF


---Michael
 