Re: forcing new to fail (or throw an exception)
On Jul 25, 6:11 pm, "H.S." <hs.saDELETEME...@gmail.com> wrote:
Victor Bazarov wrote:
H.S. wrote:
Here is a little question. I was reading up on the FAQ on pointers:
http://www.parashift.com/c++-faq-lite/freestore-mgmt.html#faq-16.6
and wanted to see what g++ (ver. 4.1.3) does if it cannot allocate
enough memory, by trying to allocate a huge amount. Here is what I am
trying:

int main(){
    double *ldP;
    ldP = new double [2048*2048*2048];
Try
size_t s = 2048*2048*2048;
which generates:
$> g++ -o testmem testmem.cc
testmem.cc: In function ‘int main()’:
testmem.cc:5: warning: overflow in implicit constant conversion
$> ./testmem
About to allocate 0 doubles
Curious that he didn't get the warning for his code. Or maybe
he didn't notice it. In fact, of course, according to the
standard, that shouldn't be a warning, but an error. (Strictly
speaking: the program is ill-formed, and the compiler must issue
a diagnostic. Formally speaking, once the compiler has issued
the diagnostic, it can do whatever it likes, including reformat
your hard drive. From a quality of implementation point of
view, of course, either the program should not compile, or the
compiler should document this as an extension. In this case, at
any rate, I'd definitely post a bug report to g++. Supposing 32-bit
ints, of course.)
std::cout << "About to allocate " << s << " doubles" << std::endl;
double *ldP = new double[s];
delete ldP;
Should be
delete[] ldP;
Thanks for the correction.
return 0;
}
It compiles okay. It runs okay too.
What am I missing here? How can I try to allocate memory huge enough
that new throws an exception?
Hard to say. Your program (due to wrong 'delete') had undefined
behaviour. Try fixing it.
Come now. We both know that the wrong delete wasn't the
problem. The problem was the overflow, which would have been
undefined behavior if the expression hadn't been a constant
expression.
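For what it's worth, a minimal sketch of how to express the count
without the overflow: do the multiplication in an unsigned 64 bit
type (a ull literal is enough), so that 2^33 stays representable.
This assumes a 64 bit size_t; on a 32 bit system the value still
won't fit, and you're back to the truncation shown above. The
numbers are just for illustration.

#include <cstddef>
#include <iostream>

int main()
{
    // 2048*2048*2048 == 2^33, which overflows a 32 bit int.  Doing the
    // arithmetic in unsigned long long keeps the value intact; on an
    // LP64 system std::size_t is 64 bits, so nothing is lost in the
    // assignment either.
    std::size_t s = 2048ull * 2048 * 2048;
    std::cout << "Would request " << s << " doubles ("
              << s * sizeof(double) << " bytes)" << std::endl;
    return 0;
}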
So after removing my mistakes, and correcting the one in your code (sort
of), here is what throws the exception (this is on a Debian Testing
kernel, 2.6.21, since max memory allocation depends on the kernel
options(?)):
#include <iostream>
int main(){
    double *ldP;
    size_t s = 2048*2048*58;
    std::cout << "About to allocate " << s << " doubles" << std::endl;
    ldP = new double [s];
    delete [] ldP;
    return 0;
}
$> g++ -o testmem testmem.cc
$> ./testmem
About to allocate 243269632 doubles
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted
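(As an aside, the "terminate called ... Aborted" lines are just what
you get when the bad_alloc propagates out of main uncaught:
std::terminate is called and the process aborts. If you want the
failure reported cleanly, catch it -- a trivial sketch, using the same
count as above:)

#include <iostream>
#include <new>

int main()
{
    try {
        double *ldP = new double[2048*2048*58];
        delete [] ldP;
    } catch (std::bad_alloc const& e) {
        // Caught here, the program can report the failure and return
        // normally instead of letting std::terminate abort the process.
        std::cerr << "allocation failed: " << e.what() << std::endl;
        return 1;
    }
    return 0;
}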
You still haven't tested much. (I know, because operator new
doesn't work correctly with the default configuration of Linux.)
Try smaller blocks, and then accessing the allocated memory.
For some configurations, you'll get a core dump. (It may be
hard to simulate if you have a lot of memory.)
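Here's a rough sketch of the kind of test I mean: allocate moderately
sized blocks in a loop and touch every page of each one. Under
Linux's default overcommit settings, the failure (if it comes) tends
to show up at the memset -- the process is killed or dumps core --
rather than as a bad_alloc from new. The 64 MB block size is an
arbitrary choice, and be warned that the program deliberately
exhausts memory, so run it under a ulimit or on a machine you don't
mind bogging down.

#include <cstddef>
#include <cstring>
#include <iostream>
#include <new>
#include <vector>

int main()
{
    std::vector<char*> blocks;
    std::size_t const blockSize = 64 * 1024 * 1024;   // 64 MB per block
    try {
        for (;;) {
            char* p = new char[blockSize];
            // Touching every page is the important part: with overcommit,
            // this write (not the new) is where the process may die.
            std::memset(p, 0xAB, blockSize);
            blocks.push_back(p);
            std::cout << "allocated " << blocks.size() * 64 << " MB"
                      << std::endl;
        }
    } catch (std::bad_alloc const&) {
        std::cout << "bad_alloc after " << blocks.size() << " blocks"
                  << std::endl;
    }
    for (std::size_t i = 0; i != blocks.size(); ++i) {
        delete [] blocks[i];
    }
    return 0;
}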
Basically, operator new can fail for three reasons: there's not
enough space available in the address space of the process (what
you're seeing, probably), the allocation would cause the process
to exceed some artificially imposed system limits (e.g. with
ulimit -m under Linux), or there really isn't enough virtual
memory. In its default configuration, Linux doesn't work in
this last case: operator new (based on what the OS told it) will
return an apparently valid pointer, which will cause a core dump
when dereferenced. (Older versions of AIX had a similar
problem, and Linux can be configured so that it behaves
correctly, too.)
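If the goal is simply to see new throw reliably, the second case is
the easiest to provoke: impose an artificial limit on the address
space yourself and then over-ask. A sketch along those lines,
assuming a POSIX system (setrlimit with RLIMIT_AS is roughly what
ulimit -v does from the shell); the 256 MB limit and the 4 GB
request are arbitrary:

#include <iostream>
#include <new>
#include <sys/resource.h>       // POSIX setrlimit

int main()
{
    // Cap the process's address space at 256 MB, so the allocation below
    // fails cleanly inside the process instead of depending on how much
    // memory the machine actually has.
    rlimit lim;
    lim.rlim_cur = 256 * 1024 * 1024;
    lim.rlim_max = 256 * 1024 * 1024;
    if (setrlimit(RLIMIT_AS, &lim) != 0) {
        std::cerr << "setrlimit failed" << std::endl;
        return 1;
    }
    try {
        double *ldP = new double[512ull * 1024 * 1024];  // ~4 GB request
        delete [] ldP;
        std::cout << "allocation unexpectedly succeeded" << std::endl;
    } catch (std::bad_alloc const&) {
        std::cout << "caught std::bad_alloc" << std::endl;
    }
    return 0;
}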
Note that in this last case, at least some configurations of
some versions of Windows will pop up a window, asking you to
stop some other programs in order to make more memory available.
(I think some other configurations will just silently increase
the size of the swap space, and silently continue.)
--
James Kanze (GABI Software) email:james.kanze@gmail.com
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34