Re: Who gets higher salary a Java Programmer or a C++ Programmer?
On Nov 25, 12:06 am, "Peter Duniho" <NpOeStPe...@nnowslpianmk.com>
wrote:
On Mon, 24 Nov 2008 14:14:01 -0800, James Kanze
<james.ka...@gmail.com> wrote:
Strictly speaking it's not a leak but a plug - memory is
held by the program, not leaked from it. It is called a
"memory leak" in Java, but, as with the term "reference",
its meaning in Java is not the same as in a language like
C++.
In no sense of the word is it a leak.
Just goes to show, you can have this argument in any
programming newsgroup, with or without a garbage collector.
:)
A leak isn't a couple of drops spilling over the edge; it is
a continuous loss.
I have never considered "continuous loss" (that is,
continually increasing over time) to be part of the criteria
for defining a "leak". Even a single block of data for which
no reference remains, and which is therefore unreachable, is
a "leak" in my book.
Well, you can define it any way you want, but any definition
not implying continuously increasing memory use has no
practical implications. And of course, in non-technical
English, a leak is also more or less permanent: you don't
say a bucket leaks because a few drops spill over the top;
you would only speak of a leak if there were a continuous
loss.
But by your definition, his code didn't leak either, so I'm not
sure what you're arguing about. I'm aware of this definition,
and considered it in my "no sense of the word".
You may feel free to disagree, but I find it pointless for you
to write something like "in no sense of the word".
OK. "In no reasonable sense of the word", then. Or "in no
practically usable sense of the word." Or "in no sense of the
word I've ever seen."
There are many "senses of the word" when it comes to the word
"leak", and lots of people use senses of the word that
support Lew's and my interpretation, not yours.
The actual example code didn't leak, in any sense of the word.
It corresponded to an established and widely used pattern.
Actually, on looking at the code more closely, I think it is
a leak. I'd missed the fact that each constructor added the
object to the list. He has, effectively, created a situation
where objects of type Foo can never be recovered by garbage
collection; this is not fundamentally different from my
classical example of a class registering itself somewhere (a
sketch of the pattern appears below, after the three cases).
So either:
1) The program needs a record of all instances of Foo; once
created, an instance lives forever; and the program never
creates more than a bounded, finite number of Foo instances.
In this case, the code is perfectly correct; if the
cardinality of that bounded set is 1, we even have the
established idiom of a singleton. (In this case, the
constructor really should be private.)
2) The program needs a record of all instances of Foo; once
created, an instance lives forever; and the number of
instances which may be created is not bounded. In this case,
he has a real problem, since his application requires a
machine with infinite memory in order to run. He probably
needs to review the requirements.
Note that in some cases, this may be more or less
inevitable, and the requirements will end up having to be
formulated in terms of "within its resource limits, the
program will...", with a definition of the behavior when the
resource limits are exceeded. Consider the symbol table in
a compiler, for example. If the program is supposed to run
24 hours a day, 7 days a week, however, this could be a
killer problem.
3) The program only needs a record of the active instances of
Foo, and the programmer has forgotten to provide a means of
"deactivating" an instance (or user code has forgotten to
call it). In this case, the code does leak. At least by my
"useful" definition.
[...]
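For concreteness, here is a minimal Java sketch of the
registration pattern behind the three cases above. The name Foo
comes from the code under discussion, but the registry field and
the deactivate() method are my own illustration (the
"deactivation" missing in case 3), not his actual code:

    import java.util.ArrayList;
    import java.util.List;

    class Foo {
        // Every instance registers itself here on construction, so the
        // garbage collector can never reclaim a Foo while it is listed.
        private static final List<Foo> INSTANCES = new ArrayList<Foo>();

        Foo() {
            INSTANCES.add(this);
        }

        // Case 3: some means of "deactivating" an instance.  Without it
        // (or if callers forget to use it), the registry grows forever.
        void deactivate() {
            INSTANCES.remove(this);
        }
    }

As long as an instance sits in that list, the collector has to
keep it; only removing the entry makes it collectable again, and
only case 1's bounded instance count makes leaving it there
harmless.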
As far as this unimportant disagreement goes, here's my
stance: garbage-collected systems cannot have true "leaks",
except for a bug in the GC itself. On the other hand,
imperative memory management systems, such as C++'s
"malloc/free/new/delete", can. And the way those "leaks"
happen is that you _do_ remove the "identifier" (i.e. the
pointer/reference to the memory) without calling the
appropriate function to actually release the memory block
back to the memory manager.
That's a nice definition for commercial purposes. It sounds
nice to be able to say that your language cannot have a leak (or
that your product detects all leaks). But it really is a sort
of commercial newspeak to redefine "leak" in order to be able
to say that. It's sort of like Java (and C++ for the unsigned
integral types) redefining arithmetic "overflow" in order to
say that integral arithmetic can't overflow. With the
difference that there are a few exotic cases (at least with
unsigned values) where the new definition is definitely useful
(and even in the case of Java, it allows "defined" behavior at
no cost on most machines; and even incorrect defined behavior
is better than undefined behavior).
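To make the overflow point concrete (my own illustration, not
something from the thread): Java defines int arithmetic to wrap
modulo 2^32, so the "overflow" below is perfectly well defined,
just not what ordinary arithmetic would give:

    public class WrapDemo {
        public static void main(String[] args) {
            int i = Integer.MAX_VALUE;   // 2147483647
            int j = i + 1;               // defined in Java: wraps around
            System.out.println(j);       // prints -2147483648; no exception
        }
    }

C++ guarantees the same wrapping only for the unsigned integral
types; for signed types the behavior remains undefined.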
--
James Kanze (GABI Software)             email: james.kanze@gmail.com
Conseils en informatique orientée objet/
                   Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34