The GetDPI Photography Forum


DT Tech Cam Test - IQ250 vs IQ260 vs IQ280

torger

Active member
Okay, I've studied the problem a bit more with my own algorithms and read up a bit on sensor technology. My current theory, and I think it's right this time, is that the desaturation is due to pixel crosstalk. Desaturation is a known artifact of pixel crosstalk.

What is pixel crosstalk? The angle of the incoming light is so low that when it passes, for example, the red filter, it crosses over to the next pixel and gets registered in the green photodiode. This means that the photon counts between R, G and B even out, ie you get desaturation, and slight blurring of details (but the blurring is not very strong, and may be less than the blurring due to limited lens performance).
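As a toy illustration of why crosstalk desaturates, here is a minimal sketch that models crosstalk as a channel-mixing matrix; the 10% leak fraction is an assumption for illustration, not a measured value for any particular sensor:

```python
import numpy as np

# Toy model of pixel crosstalk: a fraction of each pixel's signal leaks
# into its neighbours, which on a Bayer sensor mixes the R, G, B channels.
# The 10% leak fraction is an illustrative assumption.
leak = 0.10
mix = np.array([
    [1 - 2 * leak, leak,         leak],
    [leak,         1 - 2 * leak, leak],
    [leak,         leak,         1 - 2 * leak],
])

saturated_red = np.array([0.8, 0.1, 0.1])  # an RGB triple before crosstalk
after = mix @ saturated_red                # channels even out

def saturation(rgb):
    # crude HSV-style saturation: (max - min) / max
    return (rgb.max() - rgb.min()) / rgb.max()

print(after)                       # -> [0.66 0.17 0.17]
print(saturation(saturated_red))   # 0.875 before crosstalk
print(saturation(after))           # lower afterwards: desaturation
```

Note the mixing matrix rows sum to 1, so brightness is preserved while the channel differences (and hence saturation) shrink; that is the "photon counts evening out" effect in miniature.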

Pixel crosstalk cannot be corrected by LCC, at least not the algorithms seen in current software. So when you see desaturation, it's game over.

Since the desaturation increases gradually, it is very subjective how usable the IQ250 is with these lenses.

At this point my opinion is that one should really be careful with the IQ250 and wides; it's hard from these tests to know how it's actually performing. What I would do is make a color checker experiment: first shoot it in the center, then shoot the same color checker on a strongly shifted lens, and see how the colors differ after LCC has been applied. Hmm... I might do this experiment with my Aptus 75 and 35XL...
 

jlm

Workshop Member
took a quick look at the 250 at Digital Transitions, (courtesy of Doug and Lance) and would comment, from the point of view of a tech camera user:

the live view is just what you want for composing and setting focus, esp. combined with the IQ quality and image magnification. it does not measure any sort of exposure or provide any sort of exposure information (no histogram, it auto-corrects its own exposure), so from the point of view of exposure, it is not WYSIWYG. this is not a problem, nor is the delay, from my perspective. you do have the ability to live view at whatever aperture you want (unlike GG, which needs to be wide open or is too dim)

didn't try a tilt/focus experiment, but i have every reason to believe the 250 live view will offer a superior method, which may be its very strongest point

the crop factor means you will want a still shorter FL lens; the wide angle pixel issue will affect how wide you can go and how much you can shift. one can always go vertical and stitch (subject to the same limitations)

high iso is not my bag, but from what examples I saw, 800, 1600 are perfectly fine in print, and you could go more, depending on your own demands.

have to say, I wanted one
 

tjv

Active member
There is a post on LL that shows an extreme number of hot pixels (or something that looks like hot pixels) on the IQ260 shots with NR turned off. I was really shocked to see them! Can anyone comment on whether this is normal or not?
 

Paul2660

Well-known member
Actually noise reduction was on with normal Capture One defaults in images I posted. Even taking single pixel NR to 100% did not get rid of all of them.

I was a bit surprised by this as I believed the mandatory dark frame taken by the camera would have factored them out. Also, the exposure was 6 seconds as I recall, and that many stuck pixels seems a bit excessive.

The only other CCD camera I worked long exposures with was the P45+ and it was never this bad with stuck pixels.

What I don't know is what the temp in the room was. It may have been warm and caused some excess.

Comparing it to the 250, at either exposure there are no stuck pixels showing that I could see.

Paul C
 

torger

Active member
Hmm... I might do this experiment with my Aptus 75 and 35XL...
Did a quick one with a color checker in the center and then at the image circle border (90mm) with the 35XL and Aptus 75. I used a center filter so the vignetting was essentially zero.

A similar effect is seen, but to a lesser extent. There's pixel crosstalk which reduces saturation of the colors, and a G1/G2 difference that causes demosaicing artifacts (mazing), unless a two-greens-separate algorithm is used like VNG-4 (a possible choice in RawTherapee, not in Capture One afaik). I have no screenshots to show now, might do it later if anyone is interested.

Anyway, I think it's possible to make some sort of correction of this but it's not easy, and Capture One's LCC does not do it for sure.

When going for an IQ250 I'd suggest that you borrow/rent and do this simple experiment with the widest wide you intend to use:

Shoot a color checker in the center, turn and shift the camera and shoot the color checker at the image circle border (or any other limit you choose), apply LCC and compare visually. Do you see mazing in the demosaicing? Is the color difference acceptable to you or not?
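The comparison step of the experiment above can be sketched numerically. The patch values below are made-up placeholders, not real measurements, and a proper test would likely convert to a perceptual space (e.g. Lab) and report Delta E; plain Euclidean distance in linear RGB already makes crosstalk-induced shifts visible, though:

```python
import numpy as np

# Hypothetical averaged RGB values for the same color-checker patches,
# shot once at the image center and once at the image-circle border
# (after LCC). All numbers are illustrative placeholders.
center = np.array([
    [0.45, 0.31, 0.26],   # dark skin
    [0.20, 0.48, 0.59],   # blue sky
    [0.62, 0.14, 0.15],   # red
])
shifted = np.array([
    [0.43, 0.33, 0.29],
    [0.24, 0.47, 0.55],
    [0.55, 0.20, 0.21],
])

# Per-patch color difference; a nonzero delta after LCC is the
# signature of something (like crosstalk) that LCC cannot correct.
diff = np.linalg.norm(center - shifted, axis=1)
for d in diff:
    print(f"patch delta: {d:.3f}")
```

With real captures you would average each patch over many pixels first, to keep shot noise out of the comparison.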
 

jlm

Workshop Member
one thing i find a bit odd...seems the main feature of the CMOS is the ability to use live view, yet that is primarily of use for tech cameras, not DSLR work, unless tethered, i suppose.
also the lack of exposure information/histogram in the live view seems an oversight; that information would suit the tech camera user quite well.
 

Paul2660

Well-known member
one thing i find a bit odd...seems the main feature of the CMOS is the ability to use live view, yet that is primarily of use for tech cameras, not DSLR work, unless tethered, i suppose.
also the lack of exposure information/histogram in the live view seems an oversight; that information would suit the tech camera user quite well.
Good point and may come with a firmware update later on or with the next model.

Paul C
 

Nebster

Member
one thing i find a bit odd...seems the main feature of the CMOS is the ability to use live view, yet that is primarily of use for tech cameras, not DSLR work, unless tethered, i suppose.
I shoot a lot of static, planar subjects. I find I can only nail focus with live view. Viewfinder focusing delivers peak detail less than half the time.

(Actually, what I would really like is a live view mode that goes to 100% at the center and all four corners, all at the same time. I just need a little slice of each to verify that everything is square and in focus. Today I go hunting across the frame to check that focus, which takes ten times as long.)
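The multi-crop check described above is easy to sketch in software. A minimal version, with an assumed tile size and a synthetic frame standing in for the live-view image:

```python
import numpy as np

def focus_check_tiles(frame, tile=128):
    """Cut 100% crops from the four corners and the center of a frame:
    the regions one would watch while squaring up and focusing.
    `tile` is an assumed crop size in pixels."""
    h, w = frame.shape[:2]
    cy, cx = h // 2 - tile // 2, w // 2 - tile // 2
    return {
        "top_left":     frame[:tile, :tile],
        "top_right":    frame[:tile, -tile:],
        "bottom_left":  frame[-tile:, :tile],
        "bottom_right": frame[-tile:, -tile:],
        "center":       frame[cy:cy + tile, cx:cx + tile],
    }

# Example with a synthetic frame standing in for a live-view image:
frame = np.zeros((600, 800, 3), dtype=np.uint8)
tiles = focus_check_tiles(frame)
print({k: v.shape for k, v in tiles.items()})
```

The slices are views, not copies, so refreshing five tiles per live-view frame costs essentially nothing; the feature request is about the display, not the computation.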
 

jlm

Workshop Member
interesting you mention that; as a tech camera tilt user, that would be extremely useful

listening, Phase?
 

GrahamWelland

Subscriber & Workshop Member
And when it comes to the exposure display, how about a view of the histogram that has greater magnification at the highlight/shadow ends - we don't normally care so much about the middle. But make it an option.

Live View with technical cameras - now that really will be the game changer. I'm using my GG more these days for tilt but dust clean up and taking the back on & off is a bitch.
 

dchew

Well-known member
Actually, when zoomed in 100%, they could add a feature that allows us to do something like double tap one of the four existing buttons (or hold down for 1sec) and that sends you to the corresponding corner of the image. Not as good as seeing all four at once, but it could be implemented with a simple firmware update to all of our existing backs.

Dave
 

Steve C

Member
I hope Phase will listen to Tech Cam users more. Having two simultaneous selectable magnified screen areas would allow watching near and far focus points as you apply tilt. That would save so much time compared to zooming in, checking focus, zooming out, selecting a new spot, zooming in, tilting, checking focus, zooming out, repeat, repeat, repeat. Now that high quality CMOS Live View is becoming a reality, tech camera operation could be greatly simplified and achieve much more accurate results.
 

GrahamWelland

Subscriber & Workshop Member
I hope Phase will listen to Tech Cam users more. Having two simultaneous selectable magnified screen areas would allow watching near and far focus points as you apply tilt.
That would be AWESOME in real life. I'd upgrade to a back that could do that.
 

f8orbust

Active member
These are all great ideas - and things that should be doable in software fairly easily. CMOS really does open up a whole new world for the non-DSLR MFDB user, I just hope the next iteration is more wide-angle friendly. One final thing - again which should be available in software - a real-time focus mask. Imagine tilting the lens and watching the location and extent of the plane / zone of focus in real-time across the whole image. Er, yes please.
 

torger

Active member
I've been using these images to tune the LCC algorithm in Lumariver HDR. I'm not done yet but I've made progress and can report a bit about the IQ250 performance. I use the IQ250 + 32HR raws for the testing.

This combination has the worst color cast I've seen, and therefore my algorithm needed to be improved to make the best possible out of it.

The result Doug has succeeded in getting is surprisingly good, so good that one could believe the IQ250 works better with the wides than it actually does.

For the 32HR, the back is well-behaved only within the 50mm image circle. Beyond that we start to get green channel separation (G1 != G2) and crosstalk. Crosstalk affects color fidelity, and green channel separation affects demosaicing precision. In theory the green channel separation should be cancelled out by the LCC; the problem is that the main shot and the LCC do not have the same separation, I'd guess because it's a phenomenon that varies with exposure, ie the LCC is a bit darker and has less separation, meaning that after LCC has been applied there's still some separation left and demosaicing suffers. The difference between G1 and G2 is about 1/3 stop at most. There are tricks to work around this and I will probably implement them in Lumariver HDR, but the crosstalk is harder to fix. Actually it's impossible, but you could hide some of the effect with post-processing techniques. As crosstalk seems to vary over the sensor surface and you can't really know how much crosstalk you have in a certain position, it has to become manual guesswork in Photoshop.
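The G1/G2 separation described above can be measured directly from a flat-field raw. A minimal sketch, with synthetic data standing in for a real LCC frame and the 25% boost chosen to land near the ~1/3 stop figure:

```python
import numpy as np

def green_separation_stops(mosaic):
    """Average G1/G2 ratio of an RGGB Bayer mosaic, expressed in stops.
    In an RGGB layout G1 sits at (even row, odd col) and G2 at
    (odd row, even col)."""
    g1 = mosaic[0::2, 1::2].astype(float)
    g2 = mosaic[1::2, 0::2].astype(float)
    return np.log2(g1.mean() / g2.mean())

# Synthetic flat field standing in for a real LCC shot: G1 raised 25%
# above G2 to mimic the channel separation described above.
rng = np.random.default_rng(0)
flat = rng.normal(1000, 10, size=(64, 64))
flat[0::2, 1::2] *= 1.25   # boost the G1 sites only

sep = green_separation_stops(flat)
print(f"separation: {sep:.2f} stops")   # log2(1.25) is ~0.32 stops
```

A per-tile version of the same measurement (averaging over small windows instead of the whole frame) would show how the separation grows toward the shifted edge of the image circle.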

Crosstalk means that color channels are mixed (see attached image) and you get reduced saturation and color precision. If you look at the IQ260 jpeg and compare it to the IQ250 jpeg you'll see that straight ahead the color is quite close between the backs. But if you look above the front window you start to see substantial differences in color. The color does not necessarily look bad, but you have a shift and it's not correct. Further out it's not just a shift though; the desaturation becomes obvious.

There are no raws for the 40HR or 60XL, but based on the JPEGs I'd say there are large problems of this type there too, and that matches the theory, as the angles are not much smaller for these than for the 32HR (less retrofocus on the 40 and 60). As soon as you start to shift you'll get issues.

When sensors are designed, they are designed with a maximum angle before crosstalk occurs. To avoid crosstalk you put a mask around each pixel; this mask reduces the fill factor though, so you don't want to make it too large. If I remember correctly the KAF-39000 is designed for a 39 degree max angle before crosstalk. The Rodenstock Digaron-W series is probably designed with some sensor design in mind. As retrofocus gives bad properties (more distortion, harder to make sharp) you want as little of it as possible, only as much as needed to avoid crosstalk on the target sensors. I don't know what design target the Digaron-W series had, but Phase One can probably find out the angle the Sony sensor is designed for and also what the Digaron-W is designed for, so it would be quite easy for them to determine what should work per design.
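The angle argument can be made concrete with a back-of-envelope calculation. The exit pupil distance below is a hypothetical number for illustration, not a published spec for any Digaron or Schneider lens:

```python
import math

def incidence_angle_deg(image_height_mm, exit_pupil_mm):
    """Approximate chief-ray angle at a given distance from the image
    center, for a lens whose exit pupil sits exit_pupil_mm in front of
    the sensor. A thin-lens approximation; real lenses differ."""
    return math.degrees(math.atan(image_height_mm / exit_pupil_mm))

# Hypothetical numbers: a short, fairly symmetrical wide with its exit
# pupil ~45 mm from the sensor, evaluated 45 mm off-center (roughly the
# 90 mm image-circle border mentioned earlier in the thread).
angle = incidence_angle_deg(45.0, 45.0)
print(f"{angle:.1f} degrees")   # 45.0 degrees, well past a ~39 degree design limit
```

The same function shows why retrofocus designs help: pushing the exit pupil further from the sensor shrinks the angle at the same image height.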

With post-processing and good LCC algorithms you can stretch it a bit further, ie make it work okay despite some crosstalk, but if you invest in a $35K back I think you as a buyer have a right to know which lenses actually work by design and which don't.

As far as I can tell none of the lenses used in Doug's test work well. This sensor requires strongly retrofocus lens designs for wide to normal lenses. You can still get images out of it, as this test shows, but tech cameras are intended to excel in image quality, and if you start to get color fidelity issues at 5mm shift that's not my definition of high image quality, ie you need both color and resolution to work.

To make a tech camera friendly CMOS sensor with current manufacturing techniques you need to enlarge pixels and increase the light shield to reduce pixel vignetting and crosstalk (crosstalk is the larger problem here; vignetting is not too bad, especially when we have this much DR to dig into, and the tests didn't use a center filter). Sony could probably do this quite easily if they wanted. With backside illumination and stacking you can make a sensor where the vertical distance between the photodiode and the color filter is much shorter, and thus you can handle much larger angles without pixel vignetting and crosstalk. Sony has these manufacturing techniques, but the largest sensor possible to make this way so far is a 1" sensor afaik. It remains to be seen whether sensor design will adapt to current tech cam lens design, or tech cam lens design will adapt to current sensor design.
 

gerald.d

Well-known member
Fascinating analysis torger.

It's interesting to note that - to the best of my knowledge - we have yet to see a single example of the 23HR with this sensor.

I'm assuming there's a reason for that, which your analysis may go some way in explaining...

Kind regards,

Gerald.
 

dougpeterson

Workshop Member
Gerald,

No conspiracy theory required regarding not having 23HR shots posted. We are more than glad to show which lenses work and which don't. See for instance the 35XL which, frankly, does very poorly with movements on the 250 and 180, and is still shown in the test.

The reason we don't have a 23HR test posted is more practical: we don't have one in our demo/rental inventory.

If anyone in the NYC area has one and wants to set up an appointment, I'll be very glad to test it.

I actually think it will hold up fine, including with some minor movement; but only a real world test would say for sure!
 

dougpeterson

Workshop Member
Torger, I really don't agree with the pessimism of your analysis.

You say "none of the lenses" tested work well with the 250 and I just can't possibly understand how you come to that conclusion. Are there color issues on the outside of the image circle? Yes. But within the usable image circles the color and tonality and detail are really good.

Perhaps you should request the raws of the 40 and 60 (you say "there are no raws of the 40/60", but in the article I clearly state we have them and are glad to provide them; we only opted not to post hard links to all 100 raws from the test out of concern we'd overuse our storage account).

Also, as 99.9% of our customers are using Capture One to work up their Phase/tech files, you might consider doing your analysis there, rather than inventing your own algorithms.

Of course you're free to come to your own conclusion - that's why I post the raw files and a test this complete.
 

torger

Active member
Doug,

you don't need to be so defensive, no criticism intended. I'm very grateful that you've posted this so we can see and use it for evaluations. I know I can request the additional files from you (I should have been clearer about that), and I probably will at some point as they will be very good input to the algorithms. Just now I haven't, simply because there was no direct link to casually download them and I have not strictly needed more test material yet. I've put a lot of hours into this analysis so far. When finished, my algorithm will most likely outperform Capture One's current one, which will be good for everyone. Capture One isn't necessarily best at everything, you know ;-). Lumariver HDR supports loading a raw file, applying LCC and exporting a cooked raw DNG, so one can process further in any raw converter with DNG support.

My analysis is based on what's actually there in the raw files, compared to other digital backs and what is in their raw files. I see the bad behavior described in my analysis, and it should primarily affect color fidelity very early on, with demosaicing issues (mazing) further out.

To me it's not acceptable that color crosstalk starts occurring with very small shifts, and I have very strong indications that it's happening. Sure, you can shoot a scene where color shift and desaturation are less important, but when I buy into a tech camera system I want very high image quality, and that includes color fidelity. I think many other customers also care about color fidelity; tech cameras are about more than just wide angle images with sharp corners.

A test which would make it clearer is to shoot a color checker, first in the center, then further and further out, and see how color reproduction changes. If "usable image circle" is defined as the image circle within which color is the same as in the center (ie no crosstalk), I'm quite sure it will be considerably smaller than the image circles you've stated so far. And that is exactly the definition I've used in my analysis. You may think it's too strict, and that's fine by me, but I would guess there are others who would prefer this definition too.

I think it's wise to be a bit pessimistic until these color issues have been thoroughly investigated. As it's impossible to tell what's crosstalk and what's just pixel vignetting in a raw analysis of an LCC shot, one has to make a color checker test. I have very strong reasons to believe there is crosstalk early on though, as green channel separation and crosstalk happen at about the same time on other digital backs. I also see color differences in the jpeg stitches that indicate the same.

If I were selling these high end systems I would for sure investigate this property. The thing is that a customer may very well make a mistake when buying into it. Detecting a color shift is not too easy, especially if you don't know that you should look for it. Say you rent it for a couple of days and don't know what to look for: you may actually miss that the system can't reproduce colors correctly when shifted, buy into this back for doing your interior shots, and only a month or so later realize that the color fidelity is not acceptable. I would then be disappointed if the dealer had not informed me.

You can't just shoot a test scene and say "well, this looks kind of good, so it must work". The sensor has been designed to accept a certain max angle before crosstalk occurs, and the lenses have been designed to deliver a certain max angle. If I were buying into this system I would surely want to know from the seller how large the crosstalk-free image circles are.

These library shots give indications that crosstalk happens early, but a color checker test or other method is required to give a proper answer. With your contacts in the industry you can probably also get the design targets in terms of angles for both the sensor and the lenses, which is hard for me to get, as they're not available in the public data sheets.

I'm not here to trash-talk the IQ250. But I'm not here to praise it to the skies and get people to buy it either; I'm just sharing my analysis, based on my particular expertise, so potential buyers know what to look for.

 

torger

Active member
Actually, before I started to analyze the IQ250 I didn't even know that pixel crosstalk could be a real problem. And it's true that crosstalk has not been a big problem previously (although I do see it on my Aptus 75 and SK35XL at extreme shifts, now that I've learnt what to look for), as CCDs have had quite high critical crosstalk angles, with some notable exceptions.

The problem is that this Sony sensor seems to have a rather low critical crosstalk angle, so crosstalk becomes an issue to look into, and this is a "new" problem which few users are aware of. I don't think many dealers know about it either, as tests would otherwise have been devised for it.

And perhaps most important -- unlike color casts (which come from angular response variations), crosstalk cannot be corrected with an LCC shot.

There still is a chance that the crosstalk-free image circle is larger than I think it is, as I cannot say anything for sure based on the current test material. I just have indications.
 