Re: error description and solution
On 3/3/2014 11:39 AM, Joerg Meier wrote:
On Mon, 03 Mar 2014 10:17:07 -0600, Leif Roar Moldskred wrote:
Joerg Meier <joergmmeier@arcor.de> wrote:
That is an unfair and disingenuous argument. Nobody is suggesting people
don't LEARN how things work. You asked how to get something to work, not
how to learn about how the Java build process works. Going by your answer,
I would suggest you abandon Java completely, and instead focus on learning
Assembler, which of course you should only ever write using Notepad.
Obligatory link to the Story of Mel:
http://www.catb.org/jargon/html/story-of-mel.html
Every few years, someone links this, and every time, I read it again to my
great delight. As my life progresses, it's interesting for me to note how
my view of Mel changed over the years - from uncompromising and unreflective
hero worship when I was still young and a cowboy, to "I hope I never have
to fix code like that" when I started working on multi-person projects for
the first time, to "I hope I never have a coworker like that", ending with
"I hope I never have an employee like that".
Mel is, at once, the best and the worst aspects of a clever programmer.
Mel wrought his wonders when the economics of computing were
quite the opposite of those that prevail today. CPUs were slow,
so CPU cycles were in short supply and therefore expensive. Memories
were small, and also expensive (I toured a factory once and saw core
memory being assembled by hand, one ferrite doughnut at a time).
Programmers, on the other hand, were cheap and plentiful, "plentiful"
in the sense that a small number of programmers could produce enough
code to keep the small, slow machines running at full capacity; more
programmers would only have increased the backlogs.
Today it's the other way around. CPU cycles are so plentiful and
cheap that we routinely waste them on things like screen "savers."
Memories are so big that we cheerfully spend sixteen bytes or more
on a `Double' in preference to an eight-byte `double'. And as the
demand for programmers' output has grown, the programmers themselves
have become expensive. The foot is in the other shoe: Instead of
spending programmer time lavishly to rewrite a program for a new
machine, we strain every nerve to amortize development effort over
multiple hardware generations. Damn the inefficiencies, full
speed ahead!
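To put a rough number on that boxing tax, here's a back-of-the-envelope
sketch in plain Java (class name is hypothetical, figures are approximate
and JVM-dependent; it assumes a typical 64-bit HotSpot with compressed
references):

    // Back-of-the-envelope only; actual sizes depend on the JVM and its settings.
    public class BoxingCost {
        public static void main(String[] args) {
            final int n = 1_000_000;

            // Primitive array: payload is just n * 8 bytes.
            double[] primitives = new double[n];

            // Boxed array: n references (~4-8 bytes each) plus, once filled,
            // n Double objects at roughly 16-24 bytes apiece (header + value + padding).
            Double[] boxed = new Double[n];
            for (int i = 0; i < n; i++) {
                boxed[i] = (double) i;   // autoboxing allocates a Double per element
            }

            System.out.printf("primitive payload: ~%d MB%n", (long) n * 8 / (1 << 20));
            System.out.printf("boxed estimate:    ~%d MB%n", (long) n * (4 + 24) / (1 << 20));
        }
    }

On such a JVM the boxed array costs roughly three times the primitive
one before you even count the extra garbage-collector work -- and we
shrug and ship it anyway.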
Mel wrote clever code because that's the only code that would
work: The program had to fit in the tiny memory, and it had to finish
before the sponsor's CPU funds ran out. Today, you and I use words
like "asymptotically" and studiously avoid being clever. Guess what?
The fact that we *are* right doesn't mean Mel *was* wrong; the fact
that he *was* right doesn't mean we *are* wrong. We work in radically
different circumstances; it's not too much of a stretch to say we work
in radically different industries.
The interesting question, I think, is What Happens Tomorrow?
Will quantum computers or thought-directed computing or positronic
brains change things as fundamentally as they have changed since
Mel? Will the Best Practices we promote today be dismissed as just
so much primitive bunkum? Naturally, I don't know -- but I wouldn't
bet against it ...
--
Eric Sosman
esosman@comcast-dot-net.invalid