Just in case it wasn't clear from the (very informative) PowerPoint presentation on Telechron clocks, frequency time is derived by assuming that each cycle of the grid takes exactly 1/60th of a second (i.e., that the system frequency is exactly 60 Hz). Once you set a clock that runs on system frequency, its hands turn at a rate proportional to the system frequency, so as long as the system frequency stays at exactly 60 Hz, the clock keeps the correct time. If the system frequency is a little fast, the clock runs a little fast.
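If it helps, here's a minimal sketch of how that error accumulates; the function name, the per-second sampling, and the numbers are mine, just to illustrate the arithmetic:

```python
# Sketch of how a synchronous clock accrues time error (illustrative only).
# Assumes a nominal 60 Hz grid and evenly spaced frequency samples.

NOMINAL_HZ = 60.0

def clock_time_error(freq_samples_hz, dt_s=1.0):
    """Return accumulated clock error in seconds (positive = clock is fast).

    The clock advances dt_s * (f / 60) of "clock time" for every dt_s of
    real time, so the error is the integral of the fractional frequency
    deviation over the sampling interval.
    """
    error_s = 0.0
    for f in freq_samples_hz:
        error_s += dt_s * (f / NOMINAL_HZ - 1.0)
    return error_s

# Running at 60.01 Hz for a full day leaves the clock about 14.4 s fast:
print(clock_time_error([60.01] * 86_400))  # ~14.4
```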
The utility that I work for is isolated from the North American grid, and as such we are responsible for regulating the system frequency in our region. In our control centre, we have a display showing system frequency, "system" time (derived from system frequency) and GPS time (derived from a GPS signal, and presumably accurate), along with the difference between the two. Our operators regulate the system frequency so that it stays within reasonable limits, but they will run it slightly above 60 Hz if the system time is falling behind the GPS time, and slightly below 60 Hz if it is getting ahead.
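In pseudocode terms, the operators are doing something like the following by hand; the deadband and trim values here are made up for illustration, not our actual operating limits:

```python
# Hedged sketch of manual time-error correction (illustrative values only).

def frequency_target_hz(system_time_s, gps_time_s, deadband_s=3.0, trim_hz=0.02):
    """Pick a frequency setpoint that walks system time back toward GPS time."""
    error_s = system_time_s - gps_time_s  # positive = system time is ahead
    if error_s > deadband_s:
        return 60.0 - trim_hz   # run slightly slow so GPS time catches up
    if error_s < -deadband_s:
        return 60.0 + trim_hz   # run slightly fast to make up lost time
    return 60.0                 # close enough: hold nominal frequency
```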
In two of the regions that we sometimes island, we have duplicates of this clock so that the local operator can regulate the system frequency while not connected to the larger grid. In a third region, one of the local operators keeps a battery-powered clock that he uses when disconnected from the larger grid (accuracy isn't great).
A couple of years ago, the control centre clock developed a slight error, and over a weekend we "lost" about 10-15 minutes - oops!