Re: a missing feature in VC debugger
On Sun, 09 Jul 2006 15:52:53 -0400, Joseph M. Newcomer
<newcomer@flounder.com> wrote:
All the more reason for tossing ASSERT statements into it! My own belief is that any
code that has
ASSERT(p != NULL);
p->func();
is erroneous code; if the ASSERT could ever fail, then in the release version the test
compiles away to nothing and the code still takes an access fault. The correct code would be
ASSERT(p != NULL);
if(p == NULL)
    ... graceful recovery code here
p->func();
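Spelled out as a complete function (the CWidget class and its func member are invented
here purely for illustration), that pattern might look something like this:

#include <afx.h>            // MFC ASSERT

class CWidget               // hypothetical class, just for the example
{
public:
    void func() { }
};

BOOL CallFunc(CWidget* p)
{
    ASSERT(p != NULL);      // debug build: break into the debugger on a NULL p
    if (p == NULL)          // release build: the ASSERT is gone, so test anyway
        return FALSE;       // graceful recovery: report failure instead of faulting
    p->func();
    return TRUE;
}

The important detail is that the recovery branch returns (or otherwise skips the call),
so the dereference never executes once the ASSERT has compiled away.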
But again, you are assuming it is the role of the debugger to detect this (which it is
not, and never could be) or the role of the C++ runtime to detect it (which it could, but
then correct programs might be prevented from running).
Bottom line: correctness, and assertion of correctness, is always the responsibility of
the programmer. If you don't trust a component, make sure that you express that distrust
explicitly. Use ASSERT or VERIFY statements liberally.
joe
But all use of ASSERT boils down to tests that are removed from release
builds. If you are going to handle the errors "gracefully" anyway, I'm not
sure I see a whole lot of reason to use ASSERT; it seems like a
contradiction to me. That is, if I'm going to look for null pointers in
code and handle them, that's an expected condition, and I wouldn't assert
on it. The ASSERT macro is for things that "have to be".
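Concretely, here is a much-simplified picture of what that debug-only test amounts to
(the real ASSERT in afx.h routes the failure through AfxAssertFailedLine and a report
dialog, so the macro is renamed here to make clear it is only a sketch):

#include <afx.h>            // AfxDebugBreak()

#ifdef _DEBUG
#define MY_ASSERT(f) \
    do { if (!(f)) AfxDebugBreak(); } while (0)     // debug: stop in the debugger
#else
#define MY_ASSERT(f) ((void)0)                      // release: the test disappears entirely
#endif

Because the release expansion is ((void)0), the tested expression isn't even evaluated
there, which is also why anything with side effects belongs in VERIFY rather than ASSERT.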
However, I've always questioned the validity of VERIFY. For example,
consider the typical VERIFY:
VERIFY(some_function());
Now some_function is usually a function that could fail for reasons that are entirely
unpredictable, such as running out of a resource like memory. So to use VERIFY in this
context is to say, "I kinda sorta care
about errors, but not too much." At least most ASSERTs seem to be properly
written and involve conditions that are either true or false independent of
processing beyond the immediate condition; ASSERT(p != NULL) is an example
of that.
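For comparison, VERIFY keeps evaluating its argument in release builds and only drops
the check, which is exactly what makes VERIFY(some_function()) feel half-hearted. Again
a simplified sketch rather than the real afx.h definition:

#include <afx.h>            // AfxDebugBreak()

#ifdef _DEBUG
#define MY_VERIFY(f) \
    do { if (!(f)) AfxDebugBreak(); } while (0)     // debug: evaluate and check the result
#else
#define MY_VERIFY(f) ((void)(f))                    // release: still evaluate, discard the result
#endif

// In a release build this still calls some_function(), but a failure return
// is silently thrown away:
//     MY_VERIFY(some_function());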
--
Doug Harrison
Visual C++ MVP