John, the 135mm f/3.4 APO-Telyt is a very good lens. A little more expensive at around $1,500 used, but still a very nice lens.
First you need to have sorted out the whole diopter/magnifier question. The 1.4x is a big improvement, but the 1.25x works. The proper diopter is critical, because even a slight focus error results in a total blur. I stack the 1.4x and 1.25x when testing for calibration, but once it's calibrated the 1.25x works fine.
The second issue is the calibration... it's easy to find a 135 APO that focuses accurately close up, but as you move toward infinity it can go out quickly. A tough one to crack, because the lens can be dead on at 10 ft, front-focused by 2 ft at 20 ft, and pure blur at infinity. My street in FL has mailboxes every, say, 100 ft for a quarter mile. Great test location. It took DAG three tries, and he needed my M8; even though he couldn't find anything different from his test M8, he just used my M8 as the standard and adjusted the lens to my camera.
The reason I worked so long on this was that photography around water needs some reach. A 180mm field of view is often needed just to frame the scene. Having a 135 allows me to work with just my M8s; otherwise it's a DSLR. The APO is a superb lens, and I like its rendering as well as that of any of the M lenses.
Roger, I'll hold off on a 135mm lens until we know whether or not the M9 is real. If the M9 is real, then a 135mm lens is the least of my worries. BTW - I tried to look at your site, but the URL in your signature doesn't seem to be correct.
The mystery behind having no AA-filter is simply that this is only possible (and sensible) with a full-frame CCD, where the active pixels (pixel areas) are almost contiguous. Or is it possible that AA filters have their place? I know, I shouldn't have said it, but I hang around in the gutters and dark alleys of the photography world, where twisted souls who have turned their back on the Gospel of Solms whisper perverted heresies of this sort... I was going to enumerate a few, but was afraid of shocking the kiddies...
Which means there are only very small gaps between the active pixel areas.
On the other hand, interline CCD and CMOS (and also LiveMOS) sensors have active pixel areas smaller than the theoretical area calculated from the pixel pitch. So the image information that would otherwise be lost in these gaps has to be redirected to the neighbouring active pixels, which is best (?) done by an AA-filter.
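To put the "gaps" point in numbers, here is a toy sketch of the fill-factor arithmetic: what fraction of each pixel cell, as defined by the pixel pitch, is actually light-sensitive. The pitch and active-area figures below are illustrative assumptions, not measured specs of any real sensor.

```python
# Illustrative fill-factor arithmetic. Example numbers are assumptions,
# not specifications of any actual CCD or CMOS chip.

def fill_factor(pixel_pitch_um: float,
                active_width_um: float,
                active_height_um: float) -> float:
    """Fraction of the pixel cell area that is photosensitive."""
    cell_area = pixel_pitch_um ** 2
    active_area = active_width_um * active_height_um
    return active_area / cell_area

# A hypothetical full-frame CCD: the active area nearly fills the 6.8 um cell.
ff_ccd = fill_factor(6.8, 6.5, 6.5)   # ~0.91

# A hypothetical interline-CCD/CMOS cell: readout circuitry eats into the cell.
ff_cmos = fill_factor(6.8, 4.5, 4.5)  # ~0.44

print(f"full-frame CCD fill factor:  {ff_ccd:.2f}")
print(f"interline/CMOS fill factor: {ff_cmos:.2f}")
```

The point of the comparison: at the same pitch, the interline/CMOS cell here collects less than half the light of the full-frame cell, and that shortfall is what microlenses (or, per the post above, filtering) have to compensate for.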
There are some commonly (?) accepted rules for AA-filtering:
1) Without an AA-filter one can achieve the highest possible image quality (resolution, noise, colour saturation and so on).
2) Without an AA-filter, only full-frame CCDs could be used; therefore Live View or even video are out of scope.
3) It also follows from (2) that a high ISO sensitivity comparable to CMOS could only be reached with hardware/sensor pixel binning (trading resolution for adequately low noise).
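The binning trade in point (3) can be sketched with a toy 2x2 example. Under a simple shot-noise model (an assumption, not a claim about any specific sensor), summing four pixels quadruples the signal while noise grows only by sqrt(4) = 2, so SNR roughly doubles, at the cost of a quarter of the resolution.

```python
# Toy 2x2 pixel binning: sum each 2x2 block into one output pixel,
# trading resolution for signal.
import math

def bin_2x2(image: list[list[float]]) -> list[list[float]]:
    """Sum each 2x2 block of an even-sized 2D image into one binned pixel."""
    h, w = len(image), len(image[0])
    return [
        [image[y][x] + image[y][x + 1] + image[y + 1][x] + image[y + 1][x + 1]
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

img = [[1.0, 2.0, 3.0, 4.0],
       [5.0, 6.0, 7.0, 8.0],
       [9.0, 1.0, 2.0, 3.0],
       [4.0, 5.0, 6.0, 7.0]]

binned = bin_2x2(img)
print(binned)  # [[14.0, 22.0], [19.0, 18.0]]

# Under the shot-noise assumption, 2x2 binning gains log2(sqrt(4)) = 1 stop of SNR.
snr_gain_stops = math.log2(math.sqrt(4))
print(snr_gain_stops)  # 1.0
```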
Moiré is another chapter of this story. An AA-filter can, even under the best conditions, only minimize it to a certain degree, but never remove it completely. Without an AA-filter, moiré can be largely minimized, and even removed to a high degree, by software algorithms.
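One common flavour of such software correction targets colour moiré, which lives mostly in the chroma channels. The sketch below is my own toy illustration of that idea (blur chroma, preserve luma), not the algorithm of any particular raw converter; real converters use proprietary and far more sophisticated methods. For simplicity it works on a 1D row of RGB pixels.

```python
# Toy colour-moiré suppression: blur the chroma channels while leaving
# luma untouched. Illustrative only; real raw converters do much more.

def rgb_to_ycbcr(r, g, b):
    # BT.601-style luma/chroma split.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) / 1.772
    cr = (r - y) / 1.402
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * cr
    b = y + 1.772 * cb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

def blur_1d(values, radius=1):
    """Simple box blur over a 1D list (edges clamped)."""
    n = len(values)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def suppress_moire(pixels):
    """pixels: list of (r, g, b) tuples. Returns pixels with blurred chroma."""
    ys, cbs, crs = zip(*(rgb_to_ycbcr(*p) for p in pixels))
    cbs, crs = blur_1d(list(cbs)), blur_1d(list(crs))
    return [ycbcr_to_rgb(y, cb, cr) for y, cb, cr in zip(ys, cbs, crs)]
```

Because only chroma is smoothed, neutral (grey) detail passes through unchanged, which is why this kind of fix costs far less sharpness than a physical AA-filter that blurs everything.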
The differences you mentioned regarding CMOS/CCD/Full-Frame-CCD and pixel-pitch are very important and true.
But these "gaps" are handled by microlenses, which focus the light onto the light-sensitive area, not by the AA-filter. CMOS sensors are hardly usable without microlenses, while some full-frame CCDs don't have microlenses (like most MFDBs) and still lose about 1 stop of effective sensitivity (like the P30+/H3DII-31 vs. the P45+/H3DII-39), which makes them less problematic with movements/oblique light rays (real WA lenses).
As far as I know, the AA-filter doesn't affect noise or colours; it's not much more than a piece of glass that slightly blurs the image, like satin glass.
This has nothing to do with CMOS vs. CCD. CMOS sensors are easy to manufacture (in existing fabs) and can implement certain post-processing on-chip (fewer additional components). CCDs are more expensive to manufacture, and they're just converters from light to electricity (photodiodes): they have the highest fill rate but don't offer any kind of internal processing; even the amplification and A/D conversion is done by separate ICs.
The low noise of today's CMOS sensors is achieved with noise filtering, which may affect certain other IQ aspects (like microcontrast and colour). Never compare the noise of CCD and CMOS directly in RAW "without any noise reduction", because the CMOS RAW already incorporates heavy post-processing. Test them with fine details (denoising uniform areas is simple) and process both RAWs with noise tools.
Apparently there is a big Leica dealer meeting in NY this weekend, so something is up.
What is the S2?
As long as it's not on 9/11.