Re: memory leak?
On Mon, 30 Jul 2007 22:53:33 -0400, Joseph M. Newcomer
<newcomer@flounder.com> wrote:
>Since Sam was on the ANSI C committee, and claimed that it was, in fact, conforming, I
>believe him.
I wasn't there for the discussion, so I can't comment on what was discussed.
Maybe he was asserting conformance on a "freestanding" (as opposed to
"hosted") basis. Maybe he was unaware of the issue I've been talking about.
>Note that the range of char values is 0..FF, 0..FFFF or 0..FFFFFFFF, so
>realistically, the formal notion that it must hold *all* unsigned char values is
>apparently not sufficient to invalidate the implementation, since in practice the only
>char values are limited to the above ranges.
The char type can have 9 bits, in which case unsigned char can hold
0..0x1FF, because it too is 9 bits wide. In any case, I don't see the
relevance.
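
To put a number on it: every bit of unsigned char participates in its
value, so the count of distinct values follows directly from CHAR_BIT.
A throwaway program (my own illustration, not anything from the
standard or from Sam) shows the relationship on whatever
implementation you compile it with:

#include <limits.h>
#include <stdio.h>

/* Purely illustrative: the number of distinct unsigned char values is
   fixed by CHAR_BIT and nothing else.  On a hypothetical 9-bit-char
   machine, UCHAR_MAX would be 0x1FF, giving 512 values. */
int main(void)
{
    printf("CHAR_BIT  = %d\n", CHAR_BIT);
    printf("UCHAR_MAX = %u\n", (unsigned)UCHAR_MAX);
    printf("distinct unsigned char values = %lu\n",
           (unsigned long)UCHAR_MAX + 1);
    return 0;
}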
>Again, the issue is whether or not you believe that there are 2**80 possible character
>values. In the real world, character values in the very worst case cannot occupy more
>than 32 of the 80 bits.
The issue is unsigned char values stored in ints, along with a
distinguishable int value EOF. All I can suggest is that you read my last
two messages. I can't explain it any better than I already have. All bits
of unsigned char participate in its value representation, so if unsigned
char has 80 bits, there are 2**80 valid unsigned char values. It's that
simple.
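
To make the issue concrete, here's a minimal sketch of the classic
idiom at stake (my own illustration, not anyone's quoted code): getc()
must be able to return every one of the UCHAR_MAX + 1 byte values,
converted to int, plus a distinct sentinel EOF. If unsigned char were
as wide as int, say both were 80 bits, some legitimate byte value would
have to compare equal to EOF, and the loop below could no longer tell
data from end-of-file.

#include <stdio.h>

/* Classic copy loop, illustrative only.  It relies on int being wide
   enough to hold all UCHAR_MAX + 1 byte values *and* a separate EOF
   value.  If unsigned char and int had the same width, that guarantee
   would fail. */
void copy_stream(FILE *in, FILE *out)
{
    int c;                 /* int, not char, so EOF stays distinguishable */

    while ((c = getc(in)) != EOF)
        putc(c, out);
}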
>(Also note that a conforming implementation does not require that
>char be signed).
What does that have to do with anything? It's funny, you say stuff like
that as if you think I wouldn't know it, and it's completely irrelevant
anyway.
>I'll send him some email and ask how he resolved this
I wish you wouldn't. The last time you did that, you ended up with the
Harbison/volatile/multithreading debacle. If you doubt what I've posted,
I'd much prefer you take it to comp.std.c, comp.lang.c++.moderated, or
comp.std.c++, all moderated groups where a lot of experts hang out. I'd be
happy to join you there. Whatever you do, please, don't paraphrase anything
I've said. Quote exactly what I wrote in my last three messages, in their
entirety. Thanks.
--
Doug Harrison
Visual C++ MVP