The above is pretty close to what I have heard as general guidelines. It does depend on the turbo, bearings, etc. If you want to get more accurate than that, I would try to contact the manufacturer.
Depends on the size of the turbo, the size of the bearings and shaft, and type of bearings (full floating hydrodynamic or rolling element).
A turbo for a (relatively) large CAT 3612 may have greater axial and radial play specs than a turbo for an 800cc smart car diesel.
For typical hydrodynamic journal and thrust bearing systems, and assuming a typical passenger-car engine size, I would guess 0.001 to 0.003 in. of axial play is OK, and 0.001 to 0.004 in. of radial play is OK. True radial play is difficult to measure without the correct probe: it should be measured as shaft excursion between the bearings while manually displacing both the compressor and turbine ends of the rotating group in the same direction.
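If it helps, those guessed bands can be turned into a simple go/no-go check for dial-indicator readings. This is a minimal sketch; the limits are the rough numbers from this post, not manufacturer specs:

```python
# Hypothetical go/no-go check against the guessed play bands above.
# All limits are in inches and are NOT manufacturer figures.

AXIAL_SPEC = (0.001, 0.003)   # thrust (axial) play band, inches
RADIAL_SPEC = (0.001, 0.004)  # journal (radial) play band, inches

def within_spec(measured, spec):
    """Return True if a dial-indicator reading falls inside the spec band."""
    lo, hi = spec
    return lo <= measured <= hi

# Example readings
print(within_spec(0.002, AXIAL_SPEC))   # prints True  (inside band)
print(within_spec(0.006, RADIAL_SPEC))  # prints False (worn, out of band)
```

A reading below the lower limit would suggest insufficient clearance for the oil film, which is why the check rejects values on both sides of the band.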
"Axial tolerance can never be zero, because there absolutely must be allowance for thermal expansion"
Agreed, it can't be zero - the oil film has to fit somewhere. That said, it is possible to design a thrust bearing that maintains constant clearance under varying temperatures.
Engineering is the art of creating things you need, from things you can get.
I recall many years ago a case of incorrect axial preload on a ball bearing center section re-designed by a mutual acquaintance of ours (RH). It was for the Nissan '90 GTP race car.
The spool had excessive axial play. Every time the driver shifted or lifted off the throttle, the change in the pressure-force balance between the compressor and turbine would cause the spool to momentarily shift axially and upset the piston ring seal. A small amount of oil would leak into the turbine housing and create a big puff of exhaust smoke.