Re: "Small C++" Anyone?
SasQ wrote:
On Fri, 16 Mar 2007 22:40:01 +0000, JohnQ wrote:
Shortly after I graduated I was working in a data processing
center. We got a tape from a company that used 6-bit characters,
instead of the 8-bit EBCDIC our system used. We had a conversion
chart, but I had to write a program in COBOL to do the conversion.
Ugly code. Would've been much cleaner in C++.
I can't think of all the things that C++ doesn't take for granted,
such as 8-bit bytes
Because a byte doesn't have to be 8 bits. Dennis has actually shown
that there are platforms with 6-bit bytes. If C++ only allowed 8-bit
bytes, it would be impossible to use C++ on such machines.
You can't implement C++ with 6-bit bytes anyway. Not enough characters
to represent the source character set.
The same problem emerged with the Internet protocols. Because of
those differing byte sizes across platforms, all Internet standards
use the term OCTET for 8-bit units and BYTE only for machine-specific
bytes. C++ does something similar, but it uses the term CHARACTER:
the number of bits needed to store any character code on a given
platform. It's only a coincidence that on the most widely used and
best-known platforms the 'char' type is 8 bits, because those
platforms use 8-bit bytes for storing character codes.
It's not a coincidence. It's that C++ compiler writers are generally
sensible people. If you were writing a C++ compiler for an 8 bit byte
machine, and randomly decided you were going to have 'char' be 9 bits
long, your compiler would be a laughingstock.
C++ is hard to learn.
It depends on who is learning it and from what book/person ;J
It's hard to learn, period. Case in point: there's a world's leading
expert on the preprocessor (Paul Mensonides). If it were easy, why
would there be a need for a leading expert on it? Is there a world's
leading expert on 2+2?
Even if you axe templates, operator overloading, and other things
that you don't understand well, a bad programmer would still be
able to write bad programs, using 'goto' to produce spaghetti code.
But that's not C++'s fault, nor any other language's. A knife can
be used to cut bread, or to kill someone.
Good language design lines up correct practice with what is convenient
and natural for humans to write. Unfortunately, with C++ the convenient,
natural way is often the deprecated, wrong way. Case in point:
int foo[6]; // convenient, natural, deprecated, bad, boo, hiss
std::vector<int> foo(6); // correct, modern, wordy, tedious