Re: Exception Misconceptions
In article <email@example.com>, "io_x" <firstname.lastname@example.org> wrote:
"James Kanze" <email@example.com> wrote in the message
On Dec 19, 4:51 pm, "io_x" <a...@b.c.invalid> wrote:
"tanix" <ta...@mongo.net> wrote in the message
Kanze <james.ka...@gmail.com> wrote:
Wasting weeks on cleaning up the memory leaks?
I have wasted MONTHS trying to catch all the subtle
memory leaks in a sophisticated async-based program, because
network availability issues make you maintain all
sorts of queues, depending on user interaction, causing such
headaches that you cannot even begin to imagine them from the
standpoint of memory leaks.
Do you know there exist wrappers for malloc that, at the end of
the program, check whether there are "memory leaks" and report
the result to the screen?
If the program ends, it doesn't leak memory, and practically,
there's no way any tool can tell whether there is a leak or not.
Well, flip some bits in Visual Studio IDE and you'll get
a dump of ALL your memory leaks when the program terminates.
Not exactly nuclear science. But some things are not as simple
as they look. Sure, if you are willing to waste days on it,
assuming those leaks are bad enough to make you worry about them,
you can design some things to exactly identify the situation.
But there are times when you pass the same buffer around
to several methods and hold references to it on various queues,
for efficiency reasons for example. In those cases, your buffer
may be releasable by several different routines, and in an async
environment you don't have the luxury of seeing a sequential
allocation and deallocation within more or less the same scope
of execution.
You may allocate it in one place. Then, depending on what kinds
of things happen, deallocate it DAYS later for that matter,
and from SEVERAL different places. Not just one, and I could
not care less what anybody has to say on the "good design" issue.
So, considering the fact that there are several clients, which
one has to do the release?
Yes, memory deallocations are non-issues in trivial applications.
But if you do not have a gc-equivalent mechanism, no matter how
you cut it, it is going to be a headache one way or another.
if there is "no way any tool can tell whether there is a leak or not"
Well, there ARE ALL sorts of tools and methods.
It is a matter of "bang for a buck", or "return on investment",
a matter of economics and priorities.
How long is it going to take you to even get into it?
Can you afford to spend DAYS?
Well, depends on the size of the stack on your table
and relative priorities.
Yes, it is nice to hear there is such a thing as a "good design"
paradise. But that is like politicians telling you and
promising you ALL sorts of things you never seem to get.
There is this thing called reality.
That programming language has what you would all call a
"designed-in error": it can eat more memory than necessary,
because you cannot have control over the memory.
(Well, they can tell that some things are definitively leaked.
If there are no pointers to the memory, for example, it has
leaked.) What the tools do is suggest possible leaks.
With regards to the first statement, of course: I've worked on a
fairly large number of applications which didn't leak.
So in the first statement you say:
"there's no way any tool can tell whether there is a leak or not"
and in the other you say:
"I've worked on a fairly large number of applications which didn't leak."
So how can you be sure?
If you pass some of your programs through some "print leaks at
the end" tool, you may possibly find some leaks (or your
OS/compiler may let you see whether a program has leaks, i.e.
does not free all its memory).
Unfortunately, if you do raw malloc() type calls and not
new(), what you get in the dump of your memory leaks at the end
of execution is something that does not tell you WHO did the
allocation, so you have no idea who MUST have done the deallocation.
And if you have all sorts of objects allocated on the heap,
the whole thing becomes a nightmare.
So, I resorted to conditional compilation in some situations
and sacrificed some efficiency for the sake of debugging.
And there are tradeoffs in those "design decisions".
You do want the performance.
You do want the efficiency of resource usage.
You do want robustness.
You do want compactness.
And you do want ALL sorts of other things.
The question is: how do you have it all at once?
The answer: ONLY in paradise!
And not in the physical domain on planet Earth.
For my little programs in my environment, I am sure about the
memory that goes through the malloc wrapper (not about the
memory released by "new" (I never use it, but it should be used
by the compiler, or for static objects, if I remember well)).
Well, you can have an alloc wrapper.
What I did in my last version, for one of the most critical
and massively allocation-heavy methods, is to ID-stamp the
allocations. The buffers are allocated in a single place,
a driver interface. Once they are allocated, they are stamped
with an ID of 0. Once you start processing and pass those
buffers around and incorporate them into different objects,
you increase the ID.
Once you terminate the program and get your memory leak dump
from VC, the first byte of the buffer is that ID. So you can see
exactly which routines had that buffer and could or should have
released it. That took care of some of the nastiest deallocation
issues I had, which could potentially lead to MASSIVE leaks
under certain conditions.
Some of them you may have used without knowing it, since many of the
programs I've worked on aren't visible to the user---they do
things like routing your telephone call to the correct
destination. (And the proof that there isn't a leak: the
program has run over five years without running out of memory.)
Possibly the leak is small and/or in some corner case.
Even considering 5 YEARS?
So memory leaks cannot be a problem if one always uses these
special "malloc" functions (like I do, with all the bells and
whistles).
I'll say it again: there's no silver bullet.
the silver bullet exists, after all
In the end, good
software engineering is the only solution. Thus, for example, I
prefer using garbage collection
when I can, but it's a tool
which reduces my workload, not something which miraculously
eliminates all memory leaks (and some of the early Java
applications were noted for leaking, fast and furious).
Are you sure it is good to think less?
Are you sure it is good not to do formally
correct memory allocations and deallocations?