Re: auto_ptr vs. boost shared_ptr
In article <1153217452.851018.186990@s13g2000cwa.googlegroups.com>,
kanze <kanze@gabi-soft.fr> wrote:
Carl Barron wrote:
James Kanze <kanze.james@neuf.fr> wrote:
vector< shared_ptr< int > > v1;  // OK.
vector< auto_ptr< int > > v2;    // Undefined Behavior, but probably a compile error.
True, but most of the time, I find that raw pointers are
best here. Of course, I usually use the Boehm collector, so
I don't need a surrogate for garbage collection.
Does your GC handle things like
#include <algorithm>
#include <iterator>
#include <vector>

struct larger_than_10
{
    bool operator()(int *p) { return *p > 10; }
};
struct create_ptr
{
    int i;
    create_ptr() : i(1) {}
    int *operator()() { return new int(i++); }
};
struct kill
{
    void operator()(int *x) { delete x; }
};
int main()
{
    std::vector<int *> data;
    std::generate_n(std::back_inserter(data), 1000, create_ptr());
    std::vector<int *>::iterator last =
        std::remove_if(data.begin(), data.end(), larger_than_10());
    // This is meant to delete the objects with *p > 10, but see below.
    std::for_each(last, data.end(), kill());
    data.erase(last, data.end());
}
remove_if blindly overwrites the int *'s, so the removed objects either are
never deleted until the OS takes over, or the surviving pointers left in the
tail get deleted multiple times.
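One way to keep that example well defined, as a sketch of my own rather than
anything from the thread (the prune helper and not_larger_than_10 functor are
made up for illustration): std::partition only reorders the elements, so every
pointer value survives the call and the unwanted ones can be deleted before
being erased.

#include <algorithm>
#include <vector>

struct not_larger_than_10
{
    bool operator()(int *p) { return !(*p > 10); }
};
struct kill
{
    void operator()(int *x) { delete x; }
};
void prune(std::vector<int *> &data)
{
    // partition only swaps elements around, so every pointer value
    // survives; the ones to delete end up together at the back.
    std::vector<int *>::iterator last =
        std::partition(data.begin(), data.end(), not_larger_than_10());
    std::for_each(last, data.end(), kill());  // delete exactly the unwanted objects
    data.erase(last, data.end());             // then drop the now dangling pointers
}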
There can be a problem in that data.erase() doesn't actually
free memory. It calls the "destructor" on the erased elements,
and a destructor on a pointer is a no-op. The result is that
the memory the pointers between last and data.end() refer to
will not be freed until the vector itself is destroyed.
To date, this has not been a problem in my code; my vectors tend
to grow, and never shrink. But it's not hard to imagine cases
where it could be a problem. Somewhere on my to-do list are
modifications to the g++ implementation of the standard library
to ensure that pointers in conceptually raw memory are
effectively nulled.
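At the call site, that would amount to something like the following, a minimal
sketch under the assumption of a conservative collector scanning the vector's
buffer, not the actual library change:

#include <algorithm>
#include <vector>

void erase_tail(std::vector<int *> &data,
                std::vector<int *>::iterator last)
{
    // erase() only runs the no-op pointer "destructors"; overwriting the
    // elements first means the old addresses no longer sit in the vector's
    // unused capacity, where a conservative collector would still see them.
    std::fill(last, data.end(), static_cast<int *>(0));
    data.erase(last, data.end());
}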
Since std::vector is not required to, and usually does not, actually
release memory on an erase, but keeps it in its reserved space, there
is no problem there. It is the deletion of the contained pointers that
was my concern. A vector normally shrinks [end - begin decreases], but
its capacity is not reduced until the vector is destroyed. I was just
wondering about the pitfalls of <algorithm>.
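If shedding the capacity itself ever did matter, the usual C++98 idiom is the
swap trick; a small sketch, not something discussed in the thread:

#include <vector>

void shrink_to_fit(std::vector<int *> &data)
{
    // The temporary copy typically has capacity == size; swapping buffers
    // hands the trimmed allocation to data and destroys the old, larger one.
    std::vector<int *>(data).swap(data);
}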
I definitely do not recommend handing a user any access to a vector of
raw pointers; a class that is reasonably careful can use pointers. I
also note that anyone using an STL container should be wary of reaching
for <algorithm> and assuming no problems can occur. <algorithm> is
'value based', and a modifying algorithm applied to owning pointers is
a possible hidden bug; not that vector<T *> is unusable.
Perhaps 'recipe for disaster' is a little too strong. But I hardly
ever use a vector of raw pointers, because of memory management
problems I don't want to have to consider.
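For comparison, here is roughly what the original example looks like with
boost::shared_ptr (the functor names here are mine); the point is that the
value-based copying remove_if does is harmless when each copy owns a
reference.

#include <algorithm>
#include <iterator>
#include <vector>
#include <boost/shared_ptr.hpp>

struct create_sp
{
    int i;
    create_sp() : i(1) {}
    boost::shared_ptr<int> operator()()
        { return boost::shared_ptr<int>(new int(i++)); }
};
struct larger_than_10_sp
{
    bool operator()(const boost::shared_ptr<int> &p) { return *p > 10; }
};
int main()
{
    std::vector< boost::shared_ptr<int> > data;
    std::generate_n(std::back_inserter(data), 1000, create_sp());
    // remove_if still overwrites elements, but each overwrite releases a
    // reference and each surviving copy still holds one, so nothing leaks
    // and nothing is deleted twice.
    data.erase(std::remove_if(data.begin(), data.end(), larger_than_10_sp()),
               data.end());
}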
At least your GC does not seem to cause problems with <algorithm>.
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]