Re: Article on possible improvements to C++
"sfuerst" <svfuerst@gmail.com>
I think you've missed the point. In other languages, people are able
to globally override memory allocation. This is done for debugging:
To find a memory leak, you can store the file and line of the
allocation call, and see where the allocations that are never freed
come from. Another use is profiling: What sized allocations are your
program repeatedly allocating and freeing? Perhaps the use of some
other allocator other than the default could improve performance.
With C++, you can use the placement new syntax to pass extra
information to an overridden new routine. The problem is that there
is no way to globally cause this to happen, without manually editing
every single call to new in your program.
<<
TMK there are a bunch of leak detectors out there.
But I never used any of them -- Visual C++ has had it built in for decades.
And all it takes is a single #define new debug_new at the front of the
source. If you use it, you get the source file/line of the allocation
besides the leaked block's address, size and content.
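In case it helps, the recipe is roughly this -- written from memory, so
check the CRT docs for the exact names:

// MSVC-only, from memory -- the exact recipe is in the CRT docs.
// The debug heap already ships an operator new overload taking a block
// type, file and line; the macro below routes every plain 'new' to it.
#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>

#ifdef _DEBUG
#define new new(_NORMAL_BLOCK, __FILE__, __LINE__)
#endif

int main()
{
    int* leaked = new int(123);   // deliberately never freed
    (void)leaked;
    _CrtDumpMemoryLeaks();        // dumps file/line, address, size, contents
    return 0;
}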
The last time I had to put together a heap diagnostic was around '94, and
even then I was possibly just not aware of a ready solution...
Manually editing lines of code?
What you ask for is sitting there, working -- discover how it is done
instead of claiming it is impossible.
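And the mechanism is not MSVC magic either. A portable sketch of the same
trick (illustrative only; real use also wants the array forms and some care
around code that already uses placement new):

#include <cstddef>
#include <cstdio>
#include <new>

// Placement-style overload that records the call site, then forwards to
// the regular allocator.
void* operator new(std::size_t size, const char* file, int line)
{
    std::printf("alloc %zu bytes at %s:%d\n", size, file, line);
    return ::operator new(size);
}

// Matching overload so the runtime can free the block if a constructor throws.
void operator delete(void* p, const char*, int) noexcept
{
    ::operator delete(p);
}

// From here on, every plain 'new Foo(...)' becomes
// 'new(__FILE__, __LINE__) Foo(...)' -- no caller is edited by hand.
#define new new(__FILE__, __LINE__)

int main()
{
    int* p = new int(42);
    delete p;
    return 0;
}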
OTOH, the real way to write correct code is definitely not to go by that
info, but to use consistent RAII-like handling, with reviews enforcing it.
Test runs will hardly cover all possible paths, including errors and
exceptions, so relying on an empty leak list from a random run is nothing
but an illusion of being okay. While with a trivial style it is easy to
make leaks impossible.
Why invest in better patches instead of curing the problem at its roots?
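To make it concrete (may_throw is just a stand-in for anything that can
throw):

#include <memory>

void may_throw() { /* stands in for anything that can throw */ }

void leaky()
{
    int* p = new int(42);
    may_throw();    // if this throws, p leaks -- and no test run shows it
    delete p;       // unless that exact path happens to be exercised
}

void leak_proof()
{
    auto p = std::make_unique<int>(42);   // released on every path,
    may_throw();                          // exception or not
}

int main()
{
    leaky();
    leak_proof();
    return 0;
}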
(You can override
individual classes, no problem... but the only available global
overrides are operator new(size_t) and operator new[](size_t), and
these don't let you access the file and line number you need.)
If new was some sort of operator instead of a keyword, and used
brackets to surround the type, then it would be possible to use a
macro to do this. Unfortunately, this isn't the case.
I.e. we'd like to do something like:
#ifdef DEBUG_ALLOCATIONS
#define new(T) new(T, __FILE__, __LINE__)
#endif
Imagine if new was a template... (convert it from a keyword into
something within the standard library),
and then imagine if templates used normal brackets instead of greater
than and less than signs. These "tiny" changes increase the
orthogonality enough that the above becomes possible. Unfortunately,
macros require normal brackets, which is why both changes are
required, rather than just the first.
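FWIW, most of that is available today if allocation goes through an
ordinary function template instead of the keyword. A sketch (all names
made up, not a library API):

#include <cstdio>
#include <utility>

template <class T, class... Args>
T* create(const char* file, int line, Args&&... args)
{
    std::printf("alloc at %s:%d\n", file, line);
    return new T(std::forward<Args>(args)...);
}

// An ordinary call uses parentheses, so a macro can wrap it -- exactly
// what the keyword syntax prevents. (Zero constructor arguments needs
// __VA_OPT__ or similar; omitted here.)
#define CREATE(T, ...) create<T>(__FILE__, __LINE__, __VA_ARGS__)

int main()
{
    int* p = CREATE(int, 42);   // create<int>(__FILE__, __LINE__, 42)
    delete p;
    return 0;
}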
....
You don't use delete[] with a vector... (Unless you have an array of
vectors. :-P )
You should not use delete[] at all, ever. That sends all the related
problems to limbo. (This one is even easy to "review" by grep...)
Of course, I know this is just a nice way of saying that most people
shouldn't use raw arrays. They are out of date, and vectors should be
used instead.
Vectors, or other fitting collections -- they are easy to find ready to
use, and even writing one is simpler than dealing with the related
problems. I'm sure there is no use case where a built-in array would beat
them in one dimension. For multiple dimensions, see the relevant chapter
in Wilson.
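Just for contrast, the usual before/after (trivial sketch):

#include <cstddef>
#include <vector>

void with_raw_array(std::size_t n)
{
    double* a = new double[n];   // every exit path must remember delete[]
    a[0] = 1.0;
    delete[] a;
}

void with_vector(std::size_t n)
{
    std::vector<double> a(n);    // size travels with the object, freed on
    a[0] = 1.0;                  // every path; delete[] simply disappears
}

int main()
{
    with_raw_array(8);
    with_vector(8);
    return 0;
}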
The problem with delete[] is that it exists.
So eradicate it for good. How much of a real problem is THAT, really?
Especially compared to the can of worms those raw arrays bring, above all
when paired with dynamic allocation...