Display count resolution for any digital readout instrument is always plus or minus 1 least significant digit. (This is because you obviously can't display a fraction of a digit - not to be confused with "half-digit" displays, where the most significant digit is either blank or 1; e.g. a 3½-digit display can show a maximum of 1.999, 19.99, etc.)
The accuracy is primarily determined by the measurement system that drives the digital display (e.g. how many bits in the A-to-D converter).
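To put numbers on that, here's a rough sketch of what ±1 count means on a 1999-count display (the 199.9 mV full-scale range is just an assumed example):

```c
#include <stdio.h>

/* Illustrative only: express +/-1 count as a fraction of full scale.
 * The 1999-count figure comes from the 3.5-digit display above;
 * the 199.9 mV range is an assumed example. */
int main(void)
{
    const int    full_scale_counts = 1999;   /* 3.5-digit display */
    const double full_scale_mV     = 199.9;  /* assumed range */

    double one_lsd_fraction = 1.0 / full_scale_counts;
    double one_lsd_mV       = full_scale_mV / full_scale_counts;

    printf("+/-1 count = %.3f%% of full scale = %.2f mV on this range\n",
           100.0 * one_lsd_fraction, one_lsd_mV);
    return 0;
}
```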
A '1999 count' display is the same thing with the overflow bit wired up to the leading '1' digit - a common trick to double the overall range. It makes slightly less sense today, but it is still widely used.
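Something like this sketch, assuming the raw reading is already an integer count in 0-1999; the half digit is simply blank or '1':

```c
#include <stdio.h>

/* Sketch of the "overflow drives the leading 1" idea. The raw count
 * in 0..1999 is an assumed input; a real meter gets it from the ADC. */
static void show_3p5_digits(int count)
{
    int leading = (count >= 1000) ? 1 : 0;   /* the half digit: blank or 1 */
    int rest    = count % 1000;              /* the three full digits */

    if (leading)
        printf("1%03d\n", rest);
    else
        printf(" %03d\n", rest);
}

int main(void)
{
    show_3p5_digits(734);    /* displays " 734" */
    show_3p5_digits(1734);   /* displays "1734" - overflow digit lit */
    return 0;
}
```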
It is possible to keep the displayed value within about half of the least significant digit by resolving one extra digit internally and rounding it off rather than displaying it, but most designs don't bother.
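A sketch of that idea, assuming the internal reading carries one extra hidden digit (i.e. 10 internal counts per displayed count):

```c
#include <stdio.h>

/* Sketch: resolve one extra (hidden) digit internally and round it off,
 * so the displayed value stays within half a displayed LSD.
 * The factor of 10 internal counts per displayed count is assumed. */
static int round_to_display(int internal_tenths)
{
    return (internal_tenths + 5) / 10;   /* round to nearest displayed count */
}

int main(void)
{
    printf("%d\n", round_to_display(7344));  /* -> 734 */
    printf("%d\n", round_to_display(7345));  /* -> 735 (rounds up) */
    return 0;
}
```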
Usually the limiting factor is the ADC. The processing and display are trivial in comparison.