Re: Header File Clutter
On Jan 14, 7:59 pm, Keith H Duggar <dug...@alum.mit.edu> wrote:
On Jan 14, 2:00 pm, James Kanze <james.ka...@gmail.com> wrote:
On Jan 14, 4:40 pm, Keith H Duggar <dug...@alum.mit.edu> wrote:
On Jan 14, 4:41 am, James Kanze <james.ka...@gmail.com> wrote:
On Jan 13, 9:47 pm, Keith H Duggar <dug...@alum.mit.edu> wrote:
On Jan 13, 4:03 pm, Ian Collins <ian-n...@hotmail.com> wrote:
On 01/14/11 09:25 AM, Keith H Duggar wrote:
On Jan 13, 3:11 pm, Ian Collins<ian-n...@hotmail.com> wrote:
On 01/14/11 08:33 AM, Joe Hesse wrote:
When I write many X.h and X.cpp files and debug/modify
them, after a while they accumulate #includes of library
or other header files that may no longer be necessary.
I realize that there are #ifdef guards that prevent a
header file from being included more than once, so it
doesn't hurt to include a header in more places than
necessary.
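For reference, the #ifdef guards in question look something like
this (a minimal sketch; the header and macro names are just
placeholders):

    // widget.h -- guarded so that repeated inclusion in one
    // translation unit is harmless
    #ifndef WIDGET_H_INCLUDED
    #define WIDGET_H_INCLUDED

    class Widget {
    public:
        int value() const;
    };

    #endif // WIDGET_H_INCLUDED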
My question is: is there a tool that takes a C++ program
that compiles and reorganizes the headers so that there
are no more #include <blah> directives than necessary?
While it is possible, what real benefits would such a tool bring?
I would have thought it obvious that it can significantly
reduce compile times and especially eliminate dependency
triggers.
Eliminate dependency triggers, yes but significantly reduce
compile times? Include guards take care of most waste
there.
Unfortunately they do not (unless you are talking about external
guards?). The reason is that, due to the textual include mechanism
and the C++ language rules, internal guards still require that the
entire contents of the header file (and, recursively, of everything
it includes) be scanned and tokenized (though not parsed, etc.).
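A concrete sketch of what that means (file names made up): big.h
has an internal guard and is included by both a.h and b.h.

    // main.cpp
    #include "a.h"   // pulls in big.h: opened, scanned, tokenized in full
    #include "b.h"   // pulls in big.h again: the guard macro suppresses
                     // its declarations, but a naive preprocessor still
                     // opens the file and scans every token down to the
                     // matching #endif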
No they don't. Any decent compiler will note that the included
file had include guards, and not even open it when it is
included a second time. This has been standard practice since
the mid-1990's. (I'm pretty sure that g++ was the first to
introduce it.)
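As I understand it, that optimization only applies when the header
has the canonical shape, with a single #ifndef/#endif pair wrapping
the entire file and nothing but comments or whitespace outside it; a
sketch, with made-up names:

    /* matrix.h */
    #ifndef MATRIX_H
    #define MATRIX_H

    struct Matrix { double m[3][3]; };

    #endif /* MATRIX_H */

On the second and subsequent #include, the preprocessor sees that the
controlling macro MATRIX_H is already defined and never reopens the
file. Tokens outside the guard defeat the optimization.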
The above statement is simply false. I've come to realize that
when many a Mr. X in this forum writes "any decent" it actually
translates to "my fantasy".
My "any decent compiler" is admittedly a figure of speak. Most
compilers would be quite accurate, however.
I am afraid the devil is in the details for this particular C++
compilation aspect.
Agreed. Defining what is the same file is not trivial, and to
be frank, I don't know exactly what definition g++ (for example)
uses. But the problematic cases given as examples don't generally
show up in actual practice.
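One contrived example of the sort of corner case involved
(made-up paths): the same physical header reachable under two
spellings.

    // liba/common.h and libb/common.h are links to the same physical file
    #include "liba/common.h"   // guard macro gets defined, contents seen
    #include "libb/common.h"   // the "same" file? depends on the definition
    // A compiler that keys "same file" on the path spelling reopens it;
    // one that keys on device/inode does not. Neither answer is portable.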
No, it's far more complex than just defining what is the same
file. You are totally ignoring the /language specific/ hurdles
that C++ puts in the way.
No I'm not. I've actually used such compilers, and I know what
the problems are. (I've written compilers in the past. This is
not unknown terrain for me.)
When you actually start writing the compiler, reality lays waste
to your fantasies. If you would review the many, many discussions
/after/ the mid-1990's, you would learn the many reasons why you
are wrong and why the problems are only getting worse, not better
(as a result of changes in C++ usage as well as proposed language
changes):
Here is a recent article by Walter
http://www.drdobbs.com/blog/archives/2010/08/c_compilation_s.html
giving a short but accurate synopsis.
A not very accurate synopsis. G++ (and if I can trust my
measurements, Sun CC as well) don't reopen an include file ...
No doubt your measurements (or even more likely the test
cases) are highly flawed. But since you haven't presented them
how can we know?
They might be flawed; who knows. At any rate, they only
concerned specific implementations. On one very large project,
we implemented the Lakos include guards, adding include guards
around the #include. In the case of g++, it made no difference.
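For anyone who hasn't seen them, the Lakos guards referred to here
are redundant external guards wrapping the #include itself, so that
a preprocessor which doesn't optimize internal guards need not even
open the file; roughly (header and macro names hypothetical):

    // client.cpp
    #ifndef WIDGET_H_INCLUDED
    #include "widget.h"    // widget.h still carries its own internal guard
    #endif

    #ifndef GADGET_H_INCLUDED
    #include "gadget.h"
    #endif

With a compiler that already skips guarded headers on its own, as
g++ apparently does, the extra layer buys nothing, which is
consistent with the measurement reported above.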
We have on the one hand Walter Bright, a compiler-writing genius,
indeed the man who wrote the world's first direct C++ compiler; we
have his opinion and /language specific technical points/. On the
other hand we have your claims and ... well, nothing to support
them. Honestly, do you not think Walter might just be right and
you not?
On one hand, we have Walter Bright, trying to sell a new
language (D), and thus motivated to criticize C++, and on the
other, we have people who are actually using the language and
measuring their compile times.
I don't mean to criticize Walter too much (although I've used
compilers he's written, which doesn't give me a really good
impression of his skills). But the fact remains that the
article you quoted contains significant misinformation: either
he's not kept up to date with certain technologies, or he's
doing it deliberately. (Personally, I believe the former.)
The least you could do is investigate the technical points
he (and others) have raised (on numerous documented occasions) and
offer specific technical argumentation to refute them, rather than
offering up such bald claims as "a not very accurate synopsis",
"any decent compiler", yada.
I have investigated the technical points he's raised, and I did
post an explicit technical argument which refutes them: g++.
I'm not talking about some theoretical possibilities here; I'm
talking about things that are actually implemented in real
compilers, and that aren't really even state of the art. (G++
implemented this at least as early as the mid-1990's, since
that's when we looked into the issue.)
... protected by include guards more than once, regardless of how
many times it has been included.
So (sadly for all of us C++ users) you are wrong; James' fantasy
compiler does not exist; Walter is right.
My "fantasy" compiler is called g++. The source code is
available, so you can even see how it does it.
It doesn't do it. It's up to you to point out the g++ code
that you claim does (or GNU documentation making the claim)
to prove your claim. However, given that you just wrote
"Because it can? Who knows why g++ does anything;"
elsethread, I'm highly skeptical that you know much of anything
at all about current g++ implementations.
I don't look at g++ source code, if that's what you mean. I do
use g++, and do keep in contact with what it does and does not
do. It's true that I last looked at this issue some time ago,
but I don't see any reason for g++ to have changed its strategy.
Would that be fair
to say? Have you looked at g++ source in the last 10 years?
Visited the g++ development forums regularly? Written any
C++ compilers like Walter has?
I've written C compilers (which have the same problem), and I've
talked with people who contribute to g++ (albeit not recently).
I've also measured the differences (again, not recently).
In fact, I'm very curious, what evidence /at all/ are you
basing your claims on? Faith? Wishful thinking?
Claims by g++ developers, and actual measurements. (Also, some
knowledge of compilers---I know that for a given meaning of
"same file", the implementation is pretty simple.)
Also, fairly recent discussions in the C++ standards committee.
There was some consideration given to standardizing #pragma
once. It was rejected because from a standards point of view,
you would have to define what it meant to include the same file,
and there doesn't seem to be any valid portable definition. And
for any non-portable, implementation-specific definition, it's
not necessary; the g++ algorithm works just as well.
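For comparison, the two idioms side by side (a sketch; the second
is exactly the non-standard part):

    // guarded.h -- portable: identity is the guard macro, not the file
    #ifndef GUARDED_H
    #define GUARDED_H
    void f();
    #endif

    // once.h -- the compiler must decide what "the same file" means
    #pragma once
    void g();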
--
James Kanze