MGrayson
Subscriber and Workshop Member
Good. Because we're stuck with it!

I think I can live with diffraction ...
If you do lots of clever math under specific conditions, you can use diffraction to store information!

And that's the problem with diffraction. It destroys information - it's literally lossy compression.
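To hang a rough number on that "lossy compression" (my own back-of-the-envelope addition, not something from the thread): in the usual Fourier-optics picture, a diffraction-limited aperture behaves like a low-pass filter on spatial detail, with a cutoff frequency of about

\[
\nu_c = \frac{1}{\lambda N} \approx \frac{1}{0.00055\,\mathrm{mm}\times 22} \approx 83\ \mathrm{cycles/mm}
\]

for green light (about 550 nm) at f/22. Detail finer than that never reaches the sensor at all, no matter how many pixels sit behind the hole - that's the information being thrown away.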
As the theoretical aperture gets smaller, and we get to the level of just one photon, forgive me for asking (I'm not a physicist) but what happens? I kept thinking of light as a wave, but does it not behave like a particle too? I was wondering about low light situations… Darn you for opening this Pandora's box! But the graphics you supplied are wonderful, particularly in 3D.

The first and simplest case is where we forgot to put the lens on. We take the shot with a big hole in front of the sensor. What should happen? Waves of light flood in and expose every pixel evenly. This is an actual computer simulation, not just me drawing the answer. Light comes in the top and hits the sensor at the bottom. We're looking down on the space between the lens mount and the sensor. The flange distance is 20mm and the sensor width is 44mm.
As you can see, light is hitting the sensor everywhere evenly. Not a great star image.
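For anyone who wants to poke at this themselves, here is a rough sketch of the kind of 2-D wave simulation described above - my guess at the setup, not Matt's actual code. A plane wave is driven in along the top of a 44mm x 20mm box and a row of cells near the bottom stands in for the sensor; the wavelength is hugely exaggerated so the grid stays small enough to run.

```python
import numpy as np

# Hedged sketch of the "no lens, big hole" case: a 2-D scalar wave equation on a grid.
# Geometry follows the thread (44 mm wide, 20 mm mount-to-sensor); everything else is my guess.
# Units: millimetres and nanoseconds (c ~ 300 mm/ns). The wavelength is exaggerated ~1000x
# so the grid fits in memory; no absorbing boundaries, so the walls are hard (u = 0).

c = 300.0                       # speed of light, mm per nanosecond
width, depth = 44.0, 20.0       # sensor width and flange distance, mm
wavelength = 0.5                # exaggerated wavelength, mm
dx = wavelength / 10.0          # grid spacing
nx, ny = int(width / dx), int(depth / dx)
dt = 0.4 * dx / c               # time step, safely inside the stability (CFL) limit

u_prev = np.zeros((ny, nx))     # field one step ago
u = np.zeros((ny, nx))          # field now
aperture = np.ones(nx, dtype=bool)   # True where light may enter; all True = no aperture at all
sensor = np.zeros(nx)           # accumulated intensity just above the bottom wall

steps = int(0.1 / dt)           # run for ~0.1 ns, enough for light to cross the 20 mm gap
for n in range(steps):
    lap = np.zeros_like(u)      # discrete Laplacian on interior points
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] +
                       u[1:-1, 2:] + u[1:-1, :-2] - 4.0 * u[1:-1, 1:-1])
    u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
    # drive the top row with an incoming plane wave where the aperture is open
    u_next[0, :] = np.where(aperture, np.sin(2.0 * np.pi * c * n * dt / wavelength), 0.0)
    u_prev, u = u, u_next
    sensor += u[-2, :] ** 2     # the row just above the bottom wall plays the sensor

print((sensor / sensor.max())[::40])   # roughly flat across most of the sensor: evenly lit
```

Replacing `aperture` with a mask that is True only over a narrow central slice should reproduce the stopped-down cases in the next few posts, fringes and all.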
What happens if we close down the aperture?
A circle of light hits the sensor, as we'd expect, but there's some stuff going on out at the fringes. Those are the boys jumping up and down at the end of the line. All we've done between the last image and this one is remove the boys from the outer parts of the line.
Great! As the aperture gets smaller, our circle gets smaller. Pinhole cameras here we come!
But as the aperture gets smaller still, the "off the end" stuff gets more important.
If we go even smaller, it gets much worse. Diffraction is just the word for what the missing boys in the line do to our waves.
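To put rough numbers on "smaller gets worse" (my own figures, not from the simulation): after a hole of diameter d the wave spreads through an angle of roughly lambda/d, so on a sensor a distance f behind the hole the diffraction blur is about 2.44·lambda·f/d across, while the purely geometric blur is about d. One shrinks as the other grows, which is why pinhole cameras have a sweet spot. A quick table for the 20mm flange distance used above:

```python
import math

# Hedged back-of-the-envelope numbers: geometric blur vs. diffraction blur for a
# pinhole 20 mm in front of the sensor (the flange distance from the simulation).
wavelength = 0.00055    # green light, mm
f = 20.0                # hole-to-sensor distance, mm

for d in [2.0, 0.5, 0.16, 0.05]:                    # hole diameters, mm
    geometric = d                                    # the hole's own shadow
    diffraction = 2.44 * wavelength * f / d          # Airy-disk diameter on the sensor
    print(f"d = {d:4.2f} mm   geometric ~ {geometric:.3f} mm   diffraction ~ {diffraction:.3f} mm")

# The two blurs match near d = sqrt(2.44 * wavelength * f), roughly f/120 for this geometry.
print("sweet spot ~", round(math.sqrt(2.44 * wavelength * f), 3), "mm")
```

Below that sweet spot the diffraction term dominates completely, which is the "much worse" regime of the last frame.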
I promised a few trillionths of a second at a time, but went to a full tenth of a nanosecond here, because nothing much happens other than what you see. When we introduce the lens, things will get radically different because the ends of the lines are closer to the sensor than the middle!
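For what it's worth (my arithmetic, not from the post), a tenth of a nanosecond is plenty for this geometry, since in that time light covers

\[
c\,\Delta t = 3\times10^{8}\,\mathrm{m/s} \times 10^{-10}\,\mathrm{s} = 30\,\mathrm{mm},
\]

comfortably more than the 20mm from the mount opening to the sensor, so the wave has already crossed the whole box by the end of the clip.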
to be continued...
If you do lots of clever math under specific conditions, you can use diffraction to store information!
Yeah, I'm talking about good old-fashioned classical geometric optics. No quantum effects. If we wanted to talk about how lens coatings work, then we'd have to get into it. But everything I've been talking about has been known since the early 1800's.

As the theoretical aperture gets smaller, and we get to the level of just one photon, forgive me for asking (I'm not a physicist) but what happens? I kept thinking of light as a wave, but does it not behave like a particle too? I was wondering about low light situations… Darn you for opening this Pandora's box! But the graphics you supplied are wonderful, particularly in 3D.
Then I began thinking of that lonely photon and did a search and found this (unrelated, I know), but in the spirit of experimentation and the development of ultra-low-light photography I thought that I would share:
Quantum camera takes images of objects that haven’t been hit by light
A device uses quantum effects to create images of objects from light that never actually touched them
www.newscientist.com
I will stop now
Which is kind of neat.

But everything I've been talking about has been known since the early 1800's.
Those pesky telescopes!

Which is kind of neat.
(Imagine that our ancestors were already worrying about diffraction in anticipation of the invention of photography decades later!)
I wonder if one f/22 frame could help with stacking artifacts. When two branches cross at different depths, there is no image in the stack showing the more distant branch's detail where it nearly crosses behind the front one. You could use some of the diffraction-degraded image for that small region.

I do everything I can to avoid diffraction, which includes stacking if needed and where it can be utilized. Once diffraction takes its toll on an image, the sharpness can never be brought back. You may be able to trick the eye, but the image will never be as good as if diffraction was minimized.
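A rough sketch of what that patch-in could look like in practice - the file names and the mask are hypothetical placeholders, and it assumes 8-bit RGB files with the single f/22 frame already aligned to the stacked result; it's an illustration of the idea above, not a feature of any stacking software:

```python
import numpy as np
from PIL import Image

# Hedged sketch: blend a diffraction-softened f/22 frame into the stacked image,
# but only inside a (hand-painted) mask covering the crossed-branch artifact.
# All file names are hypothetical; images are assumed aligned, 8-bit RGB.
stacked = np.asarray(Image.open("stacked_result.tif"), dtype=np.float32)
f22     = np.asarray(Image.open("single_f22_frame.tif"), dtype=np.float32)
mask    = np.asarray(Image.open("crossing_region_mask.png").convert("L"),
                     dtype=np.float32) / 255.0          # 1 = artifact region, 0 = elsewhere

# Keep the sharp stack everywhere except the masked region, where the f/22 frame fills in.
blend = stacked * (1.0 - mask[..., None]) + f22 * mask[..., None]

Image.fromarray(np.clip(blend, 0, 255).astype(np.uint8)).save("patched.tif")
```

Only the small occluded region picks up the f/22 softness; the rest of the stack keeps its full sharpness.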
Victor B.
Mathematicians are not interested in the “real” world!

Matt,
This is all dovetailing nicely for me; a week ago I picked up a copy of Feynman's QED and I'm now halfway through. I have little arrows drawn on sheets of paper strewn about everywhere!
Dave
That is a very good idea!

I wonder if one f/22 frame could help with stacking artifacts. When two branches cross at different depths, there is no image in the stack showing the more distant branch's detail where it nearly crosses behind the front one. You could use some of the diffraction-degraded image for that small region.
Matt