I tried that app, and it was pretty good but had some problems. I’m not sure, but it seemed to be capturing the images and then doing the stack in the camera, so basically the same thing you would do on a computer. From the description of this feature, it works differently: the stacking is done before the A/D converter. Not sure. Images I captured with my Sony were nice, but they looked a little different from the same image shot with an ND filter. That might very well be true of this IQ4 150 as well. All I know is the day my new back arrives, I’m headed either to Oregon for waterfalls or somewhere on a coast (good excuse to go to Hawaii).
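To make the distinction concrete, here’s a minimal sketch (my own illustration, not Phase One’s actual pipeline) of the software-side stacking I mean: averaging a bunch of short exposures approximates one long exposure through an ND filter, and the random noise drops by roughly the square root of the frame count.

```python
import numpy as np

def average_frames(frames):
    """Average a stack of frames into one simulated 'long exposure'."""
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

# Simulate 16 noisy captures of the same flat gray scene (synthetic data,
# purely for illustration -- not real sensor output).
rng = np.random.default_rng(0)
scene = np.full((100, 100), 128.0)
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(16)]

averaged = average_frames(frames)
# Noise in the averaged frame is roughly 1/4 (sqrt of 16) of a single frame's.
print(round(frames[0].std(), 1), round(averaged.std(), 1))
```

Doing this before the A/D converter (as the feature’s description suggests) would mean the sensor accumulates the signal in hardware rather than averaging already-digitized frames like the sketch above.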
And of course a 42MP Sony would require substantially less computing power to handle the data than a 100 or 150MP sensor.
This concept has been around for a while; I don’t think Phase waited to implement it just to drive sales of a future back. Something in the tech of the current back makes it possible. I still think part of it is the computing power, and the article seems to indicate this as well. But Christopher makes a good point: obviously there’s something in the 150MP sensor tech that makes it possible too.