D3x rumors have been around for years; perhaps we'll finally see something this year.
What 12 bits translates to in a final image depends on several factors:
- The inherent noise in the analog section of the chip
- The camera manufacturer's decision on where to place full exposure
- The demosaicing (Bayer interpolation) algorithm in the raw converter
- Exposure and curve adjustments after conversion to an image raster.
Also keep in mind that these are linear bits: 14 stops of dynamic range fit into an 8-bit gamma 2.2 space, so it's not as if the 8-bit limit clips half the information (though some precision is lost).
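To see why gamma encoding matters here, a rough sketch: compare the ratio between full scale and the smallest nonzero code value for linear versus gamma 2.2 encodings. The function name and the simple "smallest code" measure are my own illustration, not a formal dynamic-range definition.

```python
import math

def stops_representable(bits, gamma=1.0):
    """Stops (powers of two) between full scale and the smallest
    nonzero code value, for a given bit depth and encoding gamma."""
    levels = 2 ** bits - 1              # e.g. 255 for 8 bits
    # decode the smallest nonzero code back to linear light:
    darkest_linear = (1 / levels) ** gamma
    return math.log2(1 / darkest_linear)

print(stops_representable(8))        # linear 8-bit: ~8 stops
print(stops_representable(8, 2.2))   # 8-bit gamma 2.2: ~17.6 stops
print(stops_representable(12))       # linear 12-bit raw: ~12 stops
```

By this crude measure, 8 bits with a gamma 2.2 curve can span far more than 14 stops, which is why the gamma-encoded output file is not the bottleneck.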
I think the only thing that can safely be said is:
Everything else being equal, higher bit depth is less limiting on image quality.
However, everything else is never exactly the same, which is why we wait for the reviews to see the actual performance once the chip is in a camera. Until then, a theoretical discussion of chip bit depth is just that: theoretical.
There is a parallel discussion for scanners, but that is a different situation: scanner bit depth matters most when scanning negatives, because negatives use only a limited center section of the dynamic range. That limited range is then inverted and expanded, and high bit depth means smoother tonality. Since most of us don't use our DSLRs to shoot negatives on a light table, that consideration hardly applies to cameras.