Re: What has C++ become?
Ian Collins wrote:
James Kanze wrote:
On Jun 8, 10:58 pm, Walter Bright <wal...@digitalmars-nospamm.com>
Ian Collins wrote:
On my last couple of C++ projects, I was fortunate enough to be
responsible for both the build farm design and budget as well as the
software design. So neither problem arose :)
Wow, I didn't know people actually used build farms for C++!
How many lines of code was that?
And how many different versions does he need? If you have
separate debug and release versions, for each program, on each
target platform, you can easily end up with ten or fifteen
complete builds. And with enough templates in the header files,
it doesn't take very many lines of source code (less than a
million, even) to end up needing a build farm, just to be able to do
a clean build over the weekend.
This project was about 300K lines including tests. A distributed clean
build (which included a code generation phase) took about 12 minutes,
which was too long (10 was the design limit). Any longer and the hit
to productivity would have justified adding another node.
I've looked into trying to make the C++ compiler multithreaded (so it
could use multicore computers) many times. There just isn't any way to
do it; compiling C++ is fundamentally a sequential operation. The only
thing you can do is farm out the separate source files for separate
builds. The limit achievable there is one node per source file.
My experiences with trying to accelerate C++ compilation led to many
design decisions in the D programming language. Each pass (lexing,
parsing, semantic analysis, etc.) is logically separate from the others,
meaning that each can be farmed out to a separate thread. The import
file reads can be asynchronous. The lexing, parsing, and semantic
analysis of an imported module are independent of where and how it is
imported.
While the D compiler is not currently multithreaded, the process is
inherently multithreadable, and I'll be very interested to see how fast
it can go with a multicore CPU.