The GetDPI Photography Forum


D300 12 bit vs 14 bit

clay stewart

New member
I haven't tried the 14 bit yet, but was thinking of giving it a whirl today.
Anyone know if there is any problem with CS3 and 14 bit files? Also, is there a noticeable difference in the tonal gradations, or is it just hype? Thanks, Clay :)
 

harmsr

Workshop Member
It really depends on the subject matter.

One big problem you will have with the D300 is that it slows WAY down when using 14 bit. Forget high-speed continuous shooting unless you switch back to 12 bit.

I personally found only a slight difference on the D300, in the extreme shadows and highlights.

On the D3, it gives me more recoverability in the shadows and highlights. Yes, it is noticeable in larger prints.

Best,

Ray
 

Panopeeper

Guest
There is considerable disagreement on this subject. The arguments against 14bit recording are

1. The additional bit depth does not carry useful information, except in extreme cases (like adjusting the exposure by five or more stops in raw processing). The shades between the 12- and 14bit values represent noise only.

2. The additional information cannot be utilized due to the limitations of 8bit JPEG and of today's printers and monitors.

Point 1 can be discussed endlessly.
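
For illustration, here is a rough numeric sketch of what point 1 claims (Python; the full-well capacity and read noise are assumed, illustrative numbers, not measurements from any particular camera): quantize a noisy deep-shadow signal at 12 and 14 bits and compare the quantization error with the photon and read noise.

import numpy as np

rng = np.random.default_rng(0)

FULL_WELL = 60_000   # assumed full-well capacity in electrons (illustrative)
READ_NOISE = 6.0     # assumed read noise in electrons, RMS (illustrative)

def quantize(signal_e, bits):
    # Map electron counts onto raw levels of the given bit depth, then back to electrons
    step = FULL_WELL / 2 ** bits
    return np.round(signal_e / step) * step

# A deep-shadow patch: ~300 electrons plus photon (shot) noise and read noise
samples = rng.poisson(300.0, 100_000) + rng.normal(0.0, READ_NOISE, 100_000)

for bits in (12, 14):
    q = quantize(samples, bits)
    step = FULL_WELL / 2 ** bits
    print(f"{bits}-bit: step {step:5.2f} e-, quantization error {np.std(q - samples):4.2f} e- RMS, "
          f"total noise {np.std(q):5.2f} e- RMS")

With numbers in that range the 14bit quantization step already sits well below the shot and read noise, which is the substance of point 1; whether that ever shows up in a print is the part that can be argued endlessly.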

Point 2 is faulty, for

a. the current 8bit JPEG is on life support. It will exist for many decades to come because of the billions of existing images, but its active life is virtually at its end,

b. printers with more than 8bit support are already available. Current high-resolution monitors have contrast ratios of around 300:1 to 1000:1; lower-resolution displays, like HDTVs, already go higher. New technologies are already available, though too expensive at the moment, with contrast ratios of 10,000:1 or more. These monitors will require more shade levels, i.e. greater bit depth.

Storage should not be a big consideration: CF cards are cheap and hard disk space is dirt cheap. If speed is a concern, go 12bit with lossless compression. There is no excuse for lossy compression.
 

Jorgen Udvang

Subscriber Member
New technologies are already available, though too expensive at the moment, with contrast ratios of 10,000:1 or more. These monitors will require more shade levels, i.e. greater bit depth.
My new 22" LG monitor, bought a couple of weeks ago for around 300 dollars, already claims to have a contrast ratio of 10,000 : 1. Whether that's reality or just marketing speak, I don't know, but technology is moving at incredible speed these days.

It's a very good monitor btw.
 

clay stewart

New member
Well, I did some comparisons today, and they are really not worth posting. Basically, to my untrained (bit-depth) eyes, the difference looks like something on the order of a quarter stop or less of exposure. So maybe more will be revealed in the future. Thanks for all your input, though.
 

Panopeeper

Guest
My new 22" LG monitor, bought a couple of weeks ago for around 300 dollars, already claims to have a contrast ratio of 10,000 : 1
Are you sure? That seems too high by today's standards, unless it has a very low resolution.

10,000:1 would mean that everything coming from 8bit depth is posterized.
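
Here is the back-of-the-envelope version of that claim, as a rough Python sketch (the gamma-2.2 panel model and the ~1% Weber threshold for a visible luminance step are assumptions, not anything from the monitor's spec):

GAMMA = 2.2                # assumed display gamma
CONTRAST = 10_000          # the claimed on/off contrast ratio
WEBER_THRESHOLD = 0.01     # assume roughly a 1% luminance jump is noticeable

L_WHITE = 1.0
L_BLACK = L_WHITE / CONTRAST

def luminance(code, bits=8):
    # Panel luminance for an integer code value under this simple model
    peak = 2 ** bits - 1
    return L_BLACK + (L_WHITE - L_BLACK) * (code / peak) ** GAMMA

visible = sum(
    1 for v in range(255)
    if (luminance(v + 1) - luminance(v)) / luminance(v) > WEBER_THRESHOLD
)
print(f"8-bit code steps larger than {WEBER_THRESHOLD:.0%}: {visible} of 255")

On this crude model, most of the 255 steps jump by more than the threshold, i.e. a smooth gradient would band in the shadows and midtones.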
 

Jorgen Udvang

Subscriber Member
Are you sure? That seems too high by today's standards, unless it has a very low resolution.

10,000:1 would mean that everything coming from 8bit depth is posterized.
Yes, I'm sure. When I started looking for a new monitor a couple of months ago, 3,000 : 1 was the standard. A few weeks later, 5,000 : 1 and 6,000 : 1 started appearing, and by the time I went to the shop to buy, this model had suddenly become available.

It must be said, though, that the high contrast is only available in what they call "movie mode", and I haven't read the instructions thoroughly enough yet to understand all the details.

Search for W2252TQ and you'll get a bunch of hits.
 

Panopeeper

Guest
the high contrast is only available in what they call "movie mode"
Ok, that explains the situation. I mentioned TV monitors above; that implies a different interface and lower resolution. Plus, with a constantly changing image, the eye (or rather the brain) does not have time to notice the effect, which would show up as posterization in still pictures.
 

Lars

Active member
I tried out 14-bit raw on my new D300. Had to dig waaaaay down in the shadows to find a difference. I'd say for action shooting, don't worry about it; when fps is unimportant, the extra precision doesn't hurt. But you won't see any difference unless you make aggressive post-shoot adjustments.
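
A quick way to see why, with made-up numbers rather than anything I measured: assuming the raw data is linear (so each stop down halves the number of levels), count how many raw levels land in a one-stop band that sits N stops below clipping. That handful of levels is all you have to stretch when you push the shadows.

# How many raw levels fall in a one-stop band N stops below clipping (linear raw assumed)
for bits in (12, 14):
    clip = 2 ** bits
    for stops_down in (6, 8, 10):
        band = clip / 2 ** stops_down - clip / 2 ** (stops_down + 1)
        print(f"{bits}-bit, {stops_down} stops down: ~{band:.0f} raw levels in that stop")

Those few bottom-stop levels only get spread far enough apart to matter when you lift the shadows hard, which matches what I saw.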
 