Re: Hungarian Notation and Java
On Jul 14, 9:45 am, Lothar Kimmeringer <news200...@kimmeringer.de>
wrote:
> Steve wrote:
> > Is there any reason left to use Hungarian notation with Java?
> I asked that myself - 10 years ago - and the first time I changed
> the type of a variable from int to double I answered it to myself
> with a simple: no
> Regards, Lothar
> --
> Lothar Kimmeringer               E-Mail: spamf...@kimmeringer.de
> PGP-encrypted mails preferred    (Key-ID: 0x8BC3CD81)
>
> Always remember: The answer is forty-two, there can only be wrong
> questions!
Same here, except Java wasn't on the market yet, and I didn't need to
try it in order to conceive of that scenario.
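The int-to-double scenario can be sketched like this (the Price class
and its field names are my own, for illustration only):

```java
// Sketch of what happens to a Hungarian-prefixed name when the type changes.
public class Price {
    // Originally: int iTotal;
    // After changing to double, the prefix lies:
    //     double iTotal;    // wrong prefix, actively misleading
    //     double dblTotal;  // "fixed", but every use site had to be edited
    // A domain name survives the type change with no edits at all:
    private double total;

    public void add(double amount) {
        total += amount;
    }

    public double total() {
        return total;
    }
}
```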
It's a fundamental mistake to name anything in terms of implementation
type rather than domain type. For example, a bunch of invoices stored
as a 'List<String>' named 'slInvoices' is obscure, uninformative and
trenchantly stupid since the 'sl' tells nothing about the role of the
variable in the algorithm. A much better name is 'invoices', and it
can be a 'List<String>' or 'Set<Invoice>' as the implementation
demands. (The latter most likely is better.) The name reflects its
purpose, not its irrelevant implementation type.
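The invoice example might look like this in code (the Invoice record
and the sample data are assumed, since the post only names the types):

```java
import java.util.Set;
import java.util.TreeSet;

public class Invoices {
    // A minimal Invoice type, assumed here for illustration.
    record Invoice(String number) implements Comparable<Invoice> {
        public int compareTo(Invoice other) {
            return number().compareTo(other.number());
        }
    }

    // The variable is named for its role in the algorithm, not for its
    // implementation type -- no 'sl' or 'st' prefix to go stale.
    static Set<Invoice> loadInvoices() {
        Set<Invoice> invoices = new TreeSet<>();
        invoices.add(new Invoice("2009-0001"));
        invoices.add(new Invoice("2009-0001")); // duplicate rejected by the Set
        invoices.add(new Invoice("2009-0002"));
        return invoices;
    }
}
```

Switching 'invoices' from a List<String> to a Set<Invoice> changes its
declaration, not its name.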
One of the core principles of object-oriented programming is to *hide*
implementation. Hungarian notation (as commonly practiced) is the
stupid antithesis of that.
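A minimal sketch of that hiding principle (the Billing class is
hypothetical): declare the field by its role and its abstract type, and
the concrete collection stays a private, swappable detail.

```java
import java.util.ArrayList;
import java.util.Collection;

public class Billing {
    // The name and declared type expose only the role; swapping
    // ArrayList for, say, a HashSet later changes no names and no
    // client code.
    private final Collection<String> invoices = new ArrayList<>();

    public void record(String invoiceNumber) {
        invoices.add(invoiceNumber);
    }

    public int count() {
        return invoices.size();
    }
}
```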
This is a very, very old debate and the winner was decided a looong
time ago.
<http://lmgtfy.com/?q=Hungarian+notation+usefulness>
--
Lew