Optimizing a function?
I have a function that gets called for 512*512*350 times. In this function
there is a loop that gets run each time the function is called:
void myFun(VectorType point) {
    std::vector<MyObj>& myobjs = this->FindMyObjs(point);
    int num = myobjs.size();
    for (int i = 0; i < num; i++) {
        // Do stuff with each neighbor
    }
}
Where FindMyObjs(point) is a function that looks up the point in:
std::map<int, std::vector<MyObj>> m_myMap;
which is built once, up front, before any of these calls are made:
std::vector<MyObj> FindMyObjs(VectorType point) {
    int key = computeKey(point);
    return m_myMap[key];
}
The computeKey(point) function looks like this:
unsigned int computeKey(VectorType location) {
    int x_dir = floor(location[0] / m_cellSize[0]);
    int y_dir = floor(location[1] / m_cellSize[1]) * m_split;
    int z_dir = floor(location[2] / m_cellSize[2]) * m_split * m_split;
    unsigned int key = x_dir + y_dir + z_dir;
    return key;
}
The number of elements in each std::vector<MyObj> is fixed. When the vectors
contain 216 elements each, the program takes around 50 seconds to complete
(on a 3 GHz machine with 6 GB of RAM, code compiled in release mode with
Visual Studio 2008).
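For reference, that is 512 * 512 * 350 = 91,750,400 calls, so the roughly 50 seconds work out to about 545 ns, or on the order of 1600 cycles at 3 GHz, per call.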
I have tried removing the code from the loop body, but it has almost no
effect on the computation time.
Am I missing some very basic C++ optimization here, or is the program simply
not going to run any faster on this machine?
I have made sure that only references to the already existing data
structures are used. Could it be the computeKey call (three floor
operations) that becomes expensive when executed so many times?
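To check that, I suppose I could time the key computation in isolation with something along these lines (just a sketch: computeKeyStandalone, the cell size, the split value, and the test points are all made up for the benchmark and are not my real values):
#include <cmath>
#include <cstdio>
#include <ctime>

// Standalone copy of the key computation, with made-up cell size and split
// values purely for benchmarking -- the real ones live in my class members.
static const double cellSize[3] = { 1.0, 1.0, 1.0 };
static const int    split       = 64;

unsigned int computeKeyStandalone(const double location[3]) {
    int x_dir = std::floor(location[0] / cellSize[0]);
    int y_dir = std::floor(location[1] / cellSize[1]) * split;
    int z_dir = std::floor(location[2] / cellSize[2]) * split * split;
    return x_dir + y_dir + z_dir;
}

int main() {
    const int calls = 512 * 512 * 350;   // same call count as the real program
    double point[3] = { 1.5, 2.5, 3.5 };
    unsigned int checksum = 0;           // keep the result live so the loop
                                         // is not optimized away entirely
    std::clock_t start = std::clock();
    for (int i = 0; i < calls; ++i) {
        point[0] = (i & 255) * 0.5;      // vary the input a little
        checksum += computeKeyStandalone(point);
    }
    double seconds = double(std::clock() - start) / CLOCKS_PER_SEC;
    std::printf("checksum %u, elapsed %.2f s\n", checksum, seconds);
    return 0;
}
If that loop finishes in a small fraction of the 50 seconds, the floor calls are probably not the bottleneck.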