zl2k wrote:
On Oct 14, 12:19 am, Michael DOUBEZ <michael.dou...@free.fr> wrote:
Ian Collins wrote:
zl2k wrote:
Suppose I have integers that need to be stored in a vector, and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use
{vector[counter] = aNumber; counter++;} to populate the vector than to
use push_back(aNumber)?
Probably.
You can use vector<>::reserve() instead. That way push_back() won't
cause reallocation and you can keep your vector's size consistent with
the amount of data held (and it doesn't initialize/destroy unnecessary
elements, not that it matters with ints).
But if I use reserve(maxAmount), then memory for maxAmount elements is
actually occupied by the vector, no matter how many elements are
actually there. Is that right?
Yes, but the result is the same with resize(). The standard does not
require that shrinking a vector give back memory, and I assume that in
most implementations it doesn't.
This is the "swap trick" from Scott Meyers' "Effective STL" (Item 17).
It potentially allows you to get the memory back:
vector<int> myVector; // lots of memory reserved relative to actual size
{
    vector<int> tmp(myVector); // tmp's buffer is sized to the actual data
    myVector.swap(tmp);        // myVector takes over tmp's tight buffer
}   // tmp (holding the oversized buffer) is destroyed here
// here there is a chance that myVector.capacity() has been reduced
Another possibility is that vector isn't really the container you need,
or you can build the data in another container and then copy it into a
vector.
Michael
Thanks for all the very helpful replies. One more question: