I didn’t see this thread until now for some reason.
The previous answers are wrong: the third-generation recorders (the stationary DCC-951 and DCC-730, the portable DCC-130, DCC-170 and DCC-175, the FW-68 mini-system, and the unreleased DCC-771 double deck) all use the SAA2003 / SAA2013 chips for PASC encoding/decoding, which can handle 18 bits on their input and output. They also have 18-bit-compatible ADC and DAC chips, and 18-bit-compatible digital I/O chips such as the TDA1315.
So if you record a tape with a third-gen recorder, the PASC data is based on 18-bit samples, so you could say it’s four times as accurate. Obviously, if your source is a CD player, the lowest two bits of the 18-bit input samples are always zero because CD only has 16 bits of resolution. But an 18-bit (or better) SPDIF source should result in a recording with 18 bits of audio.
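To make the “four times as accurate” remark concrete, here’s a small sketch (generic bit arithmetic, not taken from any DCC documentation) of how a 16-bit CD sample sits inside an 18-bit word and why the two extra bits quadruple the amplitude resolution:

```python
# Hypothetical illustration: a 16-bit sample padded into an 18-bit word
# by shifting left two places, so the two least-significant bits are zero,
# as described above for CD sources feeding an 18-bit recorder.
sample_16 = 0x7FFF              # full-scale positive 16-bit sample
sample_18 = sample_16 << 2      # same value on the 18-bit scale
assert sample_18 & 0b11 == 0    # the two LSBs stay zero for CD sources

# The quantization step size shrinks by a factor of four:
steps_16 = 2 ** 16
steps_18 = 2 ** 18
print(steps_18 // steps_16)     # -> 4
```

In other words, an 18-bit word divides the same full-scale range into four times as many levels, which is all that “four times as accurate” means here.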
When using one of those 18-bit decks to play a tape, the PASC decoder will decode the data on the tape to an 18-bit signal too. Most likely, it will always generate an 18-bit output signal, even for cassettes that were recorded from a CD, and even for cassettes that were recorded with 16-bit recorders.
In those cases (where the tape was recorded from a 16-bit source and/or on a 16-bit recorder), the lowest bits will just contain noise (but don’t worry: the two least significant bits add less than -97 dB of signal). For 18-bit recordings played on an 18-bit machine, the output (digital or analog) should be a psychoacoustically accurate representation of the input.
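For reference, the -97 dB figure lines up with the standard quantization formulas (these are generic textbook expressions, nothing DCC-specific):

```python
import math

def step_level_db(bits: int) -> float:
    """Level of one quantization step relative to digital full scale."""
    return 20 * math.log10(1 / 2 ** bits)

def ideal_snr_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit quantizer for a full-scale sine."""
    return 6.02 * bits + 1.76

print(f"16-bit step level: {step_level_db(16):.1f} dBFS")  # about -96.3
print(f"18-bit step level: {step_level_db(18):.1f} dBFS")  # about -108.4
print(f"16-bit ideal SNR:  {ideal_snr_db(16):.1f} dB")     # about 98.1
```

So anything confined to the bits below the top 16 sits around the -96 to -98 dB region relative to full scale, which is why the noise in those low bits is nothing to worry about in practice.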