Re: Execution time of code?
* Jeff Schwab:
James Kanze wrote:
historically, a lot of systems had clocks generated
from the mains, which meant a CLOCKS_PER_SEC of 50 (in Europe)
or 60 (in North America).
What's that got to do with clock frequency? (And why use the generator
frequency? Since we have three-phase power, couldn't the grid be used to
generate 150 or 180 Hz signals?)
Hum, this is REALLY off-topic. But as I recall, the 'clock' resolution in
Windows has to do with the ordinary Windows timer resolution, which in turn, if
I recall this correctly, and I think I do, has to do with the wiring of the very
first IBM PC's timer chip, which as I recall had three timers on the chip, and
it interrupted something like 52 times per second.
Let me check with gOOgle, just wait a moment...
Ah, not quite: it interrupted every 55 msec, that is, about 18.2 times per
second. I remembered the bit about three channels correctly, though. :-)
And it doesn't seem to be connected to the Windows timer resolution after all, dang!
But while in this really off-topic mode, that search found a useful article, <url:
On such systems, better precision simply wasn't available,
How so? Even the slowest processors I've ever seen had clock speeds on
the order of kHz. If you run slowly enough, weird stuff can happen:
capacitors leak charge, and stored values flip.
I'm too lazy to check the value of CLOCKS_PER_SEC with Windows compilers.