Re: Microsoft chooses to leave C++ compiler broken
"James Kanze" <james.kanze@gmail.com>
These errors are in the "nuisance" category: the compile fails, so
you're forced into some workaround. The one the OP mentioned is in the
"danger" category -- the code compiles silently, with incorrect code
generation.
All compilers I know of have some bugs in that category as well.
It's true that they generally agree to fix them as soon as
possible when they are pointed out (for some definition of "as
soon as possible").
Which is quite important, approach-wise. We know very well that writing
non-trivial applications offers too many opportunities for issues to
slip through; even with the best effort and massive resources, some shit
can still happen. If a QA process is honestly followed, I don't really
blame issuers for residual defects that weren't discovered until
reported from the field.
But once a bug is in the open, I do expect it to be taken seriously and
fixed. How quickly is a gray area, but what we are discussing is a flat
refusal, based on a guesstimate of likelihood.
Since I'm apparently the source of this, I'd like to put it in
perspective somewhat:
-- The bug takes a very particular combination of conditions to
appear; VC++ doesn't just drop destructors at random.
-- IMHO, in most application domains, that combination of
conditions simply won't appear, unless you're coding so
badly that nothing is going to work anyway.
Interesting; the code example I saw didn't look special, or even
unaesthetic. I could have similar code in my programs somewhere: reach
into a collection and return some processed result, with no other
return, just a throw or an assert. And how it uses temporaries is not a
review point if the result is correct.
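Something along these lines, say (a made-up sketch with hypothetical
names, not the actual code from the report):

    #include <map>
    #include <stdexcept>
    #include <string>

    // Reach into a collection and return a processed result; the only
    // other way out is a throw. One conditional return of a local and
    // no other return statement -- reportedly the very shape that
    // triggers the bug.
    std::string lookup(std::map<int, std::string> const& table, int key)
    {
        std::map<int, std::string>::const_iterator it = table.find(key);
        if (it != table.end()) {
            std::string result(it->second);   // local with a destructor
            result += " (processed)";         // some processing
            return result;                    // the single return
        }
        throw std::runtime_error("key not found");
    }

Nothing in there would raise an eyebrow in a review.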
On the other hand:
-- There are a few domains, particularly where complex
numerical analysis is involved (but possibly others as well),
where it can reasonably appear; it won't ever appear in code
I write, because my style (usually SESE) will never create
the particular combination of conditions, but the code where
I did find it wasn't unreasonable, and
-- It cost me three weeks to track down, three weeks of my
time, paid for by my employer (which means that a new
feature which we should currently be testing hasn't been
implemented yet).
My take on "rare" bugs is pretty simple. My software possibly has them,
but if no one is ever hit by one, it is practically nonexistent. If it
is found, that one occurrence indicates it is not rare enough. I may
speculate on the chances of getting hit (more as part of thinking about
why it was not discovered in-house), but it is already over the
threshold.
It is, obviously, that last point which made me bring the issue
up to begin with. I don't care if it's one in a million, if I
happen to be that one. (Similarly, I don't care if it's 999,999
in a million, if I'm the one exception.)
Yeah. As stated in one of the Discworld books -- one-in-a-million
chances tend to hit you nine times out of ten. ;-)))
But seriously, I just recently found a race condition in my current
project. My estimate of the chance is 1 in 16 million (pretty close to
winning the lottery jackpot). I learned about it from a field report,
after the code had been executed just a few hundred times.
Anyway, I can understand Microsoft's original decision, even if
I don't agree with it, and I'm thankful for Herb's intervention
now.
Well, unfortunately it is all too easy to understand all kinds of
"no-fix" decisions. Speculating or making up likelihood figures is a
good wildcard in the rationale for filing the work away, while pressing
for the resources to issue a fix is hard -- unless you are the actual
boss and want it.
That does not make it right, or even necessarily the more economical
solution, even considering just the company -- let alone factoring in
the real pain of everyone suffering the consequences.
I guess everyone around here agrees that the perfect symmetry of
ctor/dtor calls is fundamental in C++, and any code we write just takes
it for granted.
Only the younger generation:-). I can remember a time when
almost all compilers I used had some problems in this regard.
Well, I remember them too, but I thought we were past that issue by the
late 90s.
(G++ 1.49 generated the destructor calls at the end of the
block -- immediately after the ret instruction it had generated,
if there was a return statement in the block:-).)
Yeah, we had many kinds of issues, but I can't recall a single case
where anyone denied it was a clear bug that should be fixed. So that
problem belongs in the "we know what we want, it's just hard to
actually get it" category. 1.49? Was that 15-20 years ago?
Seems the folks at M$ have different thoughts: they plant a breaking
bug in the optimizer -- and then refuse to correct it for two years,
then decide not to fix it at all. To me that sounds outrageous, and
grounds to drop MS compilers.
If you refuse to use a compiler with any bugs, then you won't be
doing much C++.
The point of my statement is not having accidental bugs, but the
refusal to fix them.
Oh, Murphy can't be ignored, so bugs may be in a release, but I expect
them gone after discovery. Suggesting a "workaround" for a few weeks is
okay, but expecting actual programmers to stop using locals in a loop
-- or to start counting returns in their functions? Hell no.
First, it only occurs if you return the local. Conditionally.
And there is no other return statement in the function. I don't
think that that's a common case. It does occur, obviously
(since I encountered it), but I don't think it would ever occur
naturally in code implementing, say, a compiler. Which
doubtlessly explains why the compiler team at Microsoft thought
it was rare enough to be ignored.
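Reduced to its bones, that combination would look something like this
(my reconstruction of the described conditions, with made-up names --
not the actual code from the report):

    #include <stdexcept>
    #include <string>

    // A local with a non-trivial destructor, returned conditionally,
    // with no other return statement in the function.
    std::string trigger(bool found)
    {
        std::string s("value");
        if (found) {
            return s;            // the single, conditional return
        }
        throw std::runtime_error("not found");
    }

    // The SESE style mentioned above never produces that shape:
    // one exit, unconditional, at the end of the function.
    std::string sese(bool found)
    {
        std::string s;
        if (found) {
            s = "value";
        } else {
            throw std::runtime_error("not found");
        }
        return s;                // the single, unconditional return
    }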
I think it is pretty clear what the thinking process was. My point is that
that thinking process is fundamentally flawed (IMNSHO), and should be
eradicated.
(As a side track, it would also be interesting to see what would count
as "common enough" to warrant a fix, and how that is measured. My
experience in this area is damn sour: the evaluator really thinks the
other way around, deciding not to fix based on resource availability or
mood, then making up some verbal explanation to support it.)
[...]
Maybe a petition signed by many ACCU members (and other C++
users) could create some force.
In the case of large companies, like Microsoft and Sun, all it
takes is a big enough customer. Presumably, no large customer
had complained.
If Herb hadn't known me, and intervened
personally in the problem, they probably wouldn't have reacted.
My idea of asking the ACCU was along similar lines -- if the high gurus
of C++ declare it a Bad Thing (TM), the force is supposedly similar to
that of a big customer. After all, who makes the decision to use some
language, and which compiler? Unfortunately I didn't see much activity
-- the petition to save Bletchley Park had way more traffic. Too bad.
I still believe it is worth some pushing, and Herb -- being already on
the issue -- may have some ammunition to press for improvements. When I
was fighting the "machine", being able to show "demand" was a help.
(On the other hand, if the work-around hadn't been so simple, my
bosses might have had purchasing intervene, and we're probably
a big enough customer to get some reaction.)
Gosh, the work-around looks simple once you've spent your three weeks
locating the ill spot in your code. At that one spot. What about the
other places, and the future? Make it a bullet point in code reviews to
count exits, conditionals and temporaries in functions, to spot other
possible candidates? To me even the idea sounds ridiculous.
As other people figured, turning off optimization makes more sense,
along with starting to look for a different compiler.
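If a project-wide /Od is too big a hammer, VC++ does let you scope the
optimizer to a single function with #pragma optimize -- though that
presumes you already know which function is affected, which is exactly
the hard part (the function below is just a placeholder):

    #include <stdexcept>
    #include <string>

    // MSVC-specific: turn optimization off for just the suspect
    // function instead of building everything unoptimized.
    #pragma optimize("", off)
    std::string suspect(bool found)
    {
        std::string s("value");
        if (found) {
            return s;
        }
        throw std::runtime_error("not found");
    }
    #pragma optimize("", on)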
It is a TRUST issue. If I am not sure my tools are up to the task, they
are not fit. It is hard enough to write correct source; now add to that
the chance it will be mis-translated?
This isn't
particular to Microsoft---it's just the way things are (and I've
had similar experiences with Sun: when my one-man firm posted an
error, it was noted; when my customer, who was Sun's largest
account in Europe at the time, complained, one week later
there were three engineers from California on site to find out
what the problem was.)
Sure, the last part -- big power induces a fast and big effect -- is
there. That doesn't make it necessary for less spectacular requests to
be discarded, especially in such an uber-blatant way: on the ACCU list,
people said the discussed issue is present in VS2005. A fix did not
make it into the 05 service packs, nor the next major release (08), nor
its SP1, nor the next release (10).
That is the real problem. Not issuing a specific hotfix tomorrow can be
rationalized somehow by the "rare" argument, but the issue should still
wear its "bug -- to be fixed" status, and not survive any forthcoming
milestone.
In the last decade, I wrote several times in different forums,
addressing the usual MS-bashing and claims of poor quality, that it
mostly roots in the 90s, and that IMO the company got on the right
track around 2000 -- certainly the cleanup takes time, but the will is
there, and the resources are used for good.
Like most companies, they're mixed. If you use them, you take
the bad with the good.
Well, that "accepting bad" was a major force in early 90s. i guess MS made
insane amount of money on it, but also picked up all the bad reputation that
stick a decade after change of heart and will linger for much longer. (I do
remember the pre-win3x era when Microsoft was a brand of very good wality,
be it a C compiler, MS-Word, MS-dos 3.3. To be replaced by rush, crashes,
instability and no fixes only in next releases, that were full of different
bugs. Not even counting the later internet era with the impact of
onmipresent buffer overruns and related attacks.
Accepting the bad costs the world a few billion dollars by the lowest
estimate.
Our options are certainly limited, but IMO it is important to push in
the other direction, and to keep the companies from reverting to those
ill tactics.
--
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]