Re: We do not use C++ exceptions
on Sat Jan 24 2009, Andrei Alexandrescu <SeeWebsiteForEmail-AT-erdani.org> wrote:
David Abrahams wrote:
on Wed Jan 21 2009, Andrei Alexandrescu <SeeWebsiteForEmail-AT-erdani.org> wrote:
It doesn't help overall system reliability much if you're trying to
make the system resilient against programmer errors, but personally I
think that approach is a dead end. Systems that try to be resilient
quickly become messy and unmaintainable due to the extra resiliency
code, which can almost never be properly tested. Rather, I prefer to
concentrate on making it less likely that programmer errors will occur,
and one way to do that is to build as much information about
preconditions as possible into the parameter types.
Using unsigned pushes the question to the boundary between the function
and its caller rather than allowing the question to occur inside the
function where it complicates code. Callers of functions already need
to understand the relationship between argument and parameter types and
watch out for narrowing conversions (which occur even with signed
types), so it doesn't make for a new point of unreliability.
Meh, problem is that using unsigned in function interfaces has little
effect. Granted, saying:
void fun(unsigned x);
is a rather concise way of saying:
// You'd better not pass a negative integer!!! I will treat
// small negative integers as large positive numbers!!!
void fun(int x);
Exactly.
because the signature still allows things like:
unsigned int x = 3;
fun(x - 10);
Got integers only? No problem. "Give me your tired, your poor, your
huddled integers of any size and signedness. I'll take 'em."
signed int x = 3;
fun(x - 10);
fun accepts pretty much *any* integral type with a pulse (except long,
when narrowing is an issue), and the code compiles flag-free. On
fashionably rare occasions, the compiler wakes from its coma and mumbles
something about potential signedness issues (most often when there is no
issue at all).
I'm not excusing liberal signed/unsigned inter-conversions. We have to
live with those. What we don't have to live with is complicated checks
inside functions for conditions that are preventable at the function
interface boundary.
So while it is nice that unsigned can be put in the signature as a
concise statement of expectations, in the end it is little more than a
comment, because the compiler does little to enforce said expectations.
Nor does it *ever*, since anyone can define a type with an implicit
conversion to your argument type.
One problem with unsigned is that small negative integers (which occur
frequently in code) convert automatically to large unsigned numbers.
This problem is partially offset by the fact that large unsigned numbers
are rather rare and can be properly flagged as errors (e.g. when used as
array indices). But when the unsigned is used to do some math or
allocate memory, bizarre results are easily within reach.
Sure.
What we want to experiment with in D is disabling the most dangerous
conversion (int -> unsigned) and see how restrictive the resulting
conversion graph is.
Seems like a reasonable tack, but this thread was about what to do in
C++, not how to write a better language.
The true solution is to use a flow-sensitive value range propagation
analysis; that will associate with every number a possible range at any
point, which will catch many potential problems without weeding out
many correct uses. That's difficult to implement, but I guess as soon
as I bring up the prospect of slashing the size of any codebase in two,
the motivation will be there :o).
Yeah, that would be super nice. You could actually do that with C++,
right?
It's too bad that it is considered outside the domain of this library:
http://student.agh.edu.pl/~kawulak/constrained_value/constrained_value/rationale.html#constrained_value.rationale.overflows
--
Dave Abrahams
BoostPro Computing
http://www.boostpro.com
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]