Re: program is not crashing, after 10 iteration
On Jul 18, 10:10 pm, "Alf P. Steinbach" <al...@start.no> wrote:
> * Juha Nieminen:
> > Andy Champ wrote:
> > > I'd say to anyone stay away from malloc. It's dangerous.
> > > Use the STL stuff to manage memory, it's much safer.
> > > And if you can come up with a good reason to use malloc,
> > > you probably know enough to know when to break the rule.
> > > It occurs to me that _I_ haven't used malloc all year.
> > A related question: If you really need to allocate
> > uninitialized memory for whatever reason (e.g. you are
> > writing your own memory allocator, some kind of memory pool,
> > or other such low-level thing), is there any practical
> > difference between using std::malloc() and ::new?
> The latter is compatible with your app's overall other error
> handling.
Note that when dealing with raw memory, I prefer ::operator
new(n) to ::new char[n]. IMHO, it expresses the intent better.
> How to deal with memory exhaustion is a difficult topic,
> though.
Except when it's easy:-).
> For small allocations you want to rely on a terminating
> new-handler or other scheme that terminates, instead of the
> default std::bad_alloc exception.
Which makes it easy:-). (And a lot of applications can use this
strategy.)
> But say you're loading a large picture and that allocation
> might fail. In that case your app is usually /not/ hopelessly
> screwed if the allocation fails, so in that case you may want
> the exception, and just report load failure to the user.
Or you might want to change strategies, spilling part of the
data to disk, in which case, you'd use new(nothrow).
The problem here is that when it "just fits", you might still
end up using so much memory that the machine starts thrashing.
This is one of the cases where memory exhaustion is a difficult
topic.
> What's difficult is what to do for the terminating case.
> Do you just log (if you have logging) and terminate, or do you
> let your app have a go at cleaning up, via some "hard
> exception" scheme with stack unwinding up to the top? The
> problem with the unwinding is that if the failure is caused by
> a small allocation, or e.g. if there's a memory-gobbling
> thread or process around, then even just making an attempt at
> cleanup might make matters worse, e.g. one might end up with a
> hanging program and no log entry. Debates about this have been
> endless with no firm conclusion, only that some people find
> one or the other idea horrible and signifying the utter
> incompetence and perhaps even lack of basic intellect of those
> arguing the other view. :-)
There's also the problem that the logging mechanism might try to
allocate (which will fail).
One strategy that I've seen used with success (although I don't
think there's any hard guarantee) is to "pre-allocate" a couple
of KB (or maybe an MB today) up front---the new handler then
frees this before starting the log and abort procedure.
> > Should one be preferred over the other?
Yes. ;-)
--
James Kanze (GABI Software) email:james.kanze@gmail.com
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34