I'm a software engineer. When I got my a900, I spent a bit of time reading the dcraw source to see how CRAW is handled before deciding which raw format I wanted to use.
I don't remember all the details, but basically, CRAW records the full 12-bit value of one pixel, then stores an 8-bit offset from that value for each of the following pixels in the same line (the group is small, maybe 16 pixels), at which point it records another full 12-bit value and starts the next group. Since a system of lenses, AA filter, and sensor doesn't have infinite contrast, it's rare, if even possible, for 8 bits to be too few to record the difference between one pixel and the one next to it. And in any event, any error gets corrected quickly, at the next full 12-bit value.
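Roughly, the idea looks like this. To be clear, this is my sketch of the scheme as I remember it, not actual dcraw code: the group size, the signedness of the offset, and the clamping behavior are all assumptions on my part.

```python
def encode_line(pixels, group=16):
    """Sketch of the delta scheme described above: one full 12-bit
    anchor value per group, then an 8-bit signed offset from that
    anchor for each remaining pixel in the group. Offsets outside
    the 8-bit range are clamped (this is where loss could occur)."""
    out = []
    for i in range(0, len(pixels), group):
        block = pixels[i:i + group]
        anchor = block[0]                 # full 12-bit value
        out.append(('A', anchor))
        for p in block[1:]:
            delta = max(-128, min(127, p - anchor))  # 8-bit signed offset
            out.append(('D', delta))
    return out

def decode_line(stream):
    """Inverse of encode_line: rebuild pixels from anchors + offsets."""
    pixels, anchor = [], 0
    for kind, val in stream:
        if kind == 'A':
            anchor = val
            pixels.append(anchor)
        else:
            pixels.append(anchor + val)
    return pixels
```

On smooth data the round trip is exact; only a jump of more than 127 counts from the group's anchor would get clamped, and even then the next anchor resets things.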
After looking at the code, I never used RAW with my a900, only CRAW. I don't think it loses anything visually, especially once you account for noise (you can't know exactly what a pixel should be anyway).
I don't know how 14-bit data is handled, but it may only be used internally by the imaging ASIC before the raw file is created. That would still reduce the effect of noise introduced during analog-to-digital conversion, which is the real point of 14 bits anyway.