Strange "framerate drop" when using my own timer class
I wrote a small timer class for use in a graphics engine I am working on
to teach myself.
The small time-based animations I have tried with it seem to work fine,
but when displaying the frame rate I noticed a "drop in framerate"
appearing at regular intervals without any apparent reason.
Here's the code I'm using:
#include <ctime> // For std::clock() and CLOCKS_PER_SEC

class BasicTimer
{
public:
    void update();
    float getTimePerFrame() const;

    BasicTimer();
    ~BasicTimer();

private:
    // Copy constructor and assignment operator,
    // declared private for copy protection
    BasicTimer(const BasicTimer& rFrom);
    BasicTimer& operator=(const BasicTimer& rFrom);

    // Attributes (all in seconds)
    float timeOfThisFrame_;
    float timePerFrame_;
    float timeOfLastFrame_;
};
inline BasicTimer::BasicTimer()
    : timeOfThisFrame_(0.0f),
      timePerFrame_(0.0167f), // Roughly 1/60 s until the first update()
      timeOfLastFrame_(static_cast<float>(std::clock()) /
                       static_cast<float>(CLOCKS_PER_SEC))
{
}
inline BasicTimer::~BasicTimer()
{
}
inline void BasicTimer::update()
{
    timeOfThisFrame_ = static_cast<float>(std::clock()) /
                       static_cast<float>(CLOCKS_PER_SEC);
    // Only accept a non-negative delta, in case the clock ever wraps
    // around (however unlikely considering the data range)
    if (timeOfThisFrame_ - timeOfLastFrame_ >= 0.0f)
    {
        timePerFrame_ = timeOfThisFrame_ - timeOfLastFrame_;
    }
    timeOfLastFrame_ = timeOfThisFrame_;
}
inline float BasicTimer::getTimePerFrame() const
{
    return timePerFrame_;
}
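update() is called once per frame from my main loop, which looks roughly
like this (heavily simplified; drawScene() and swapBuffers() are just
stand-ins for the real rendering and present calls):

// Simplified sketch of the main loop; drawScene() and swapBuffers()
// are placeholders for the actual rendering/present code.
BasicTimer timer;
while (running)
{
    timer.update(); // Sample the clock once at the start of the frame
    drawScene();    // Currently just two triangles
    swapBuffers();  // Present; VSync applies here if enabled
}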
I display the framerate in my main() by calling:
std::cout << (1.0f / timer.getTimePerFrame()) << std::endl;
and I usually get 1.#INF, since I am only drawing two triangles right
now, or a value around 60.0 if I activate VSync in my GPU's driver
settings.
Anyway, every 10 frames or so the framerate drops: either from 1.#INF to
60, or from 60.0 to 30.0, depending on which mode I am in.
It really does happen at somewhat regular intervals, but I don't see
what's causing it. The animation seems smooth enough, so the values
appear to be correct, but why would some frames, at regular intervals,
take longer to render?
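If it helps to see the raw values I am dividing, this standalone snippet
(independent of the engine) simply dumps a run of consecutive clock()
readings:

#include <ctime>
#include <iostream>

// Standalone test, not part of the engine: print consecutive raw
// clock() samples to see how the underlying value advances.
int main()
{
    for (int i = 0; i < 20; ++i)
    {
        std::cout << std::clock() << '\n';
    }
    std::cout << "CLOCKS_PER_SEC = " << CLOCKS_PER_SEC << '\n';
}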