Re: streaming to unsigned char stringstream
zero wrote:
> std::basic_stringstream<unsigned char> theStream;
Stop here: if this compiles at all, it is an extension. The standard only
requires that streams for char and wchar_t are provided. It often compiles
anyway because basic_stringstream is just a template that adapts to the
character type, but just as often it fails because there is no
char_traits<unsigned char> specialisation.
> theStream << i;
What happens here is that the num_put facet is looked up in the stream's
locale. By default, this is a copy of the global locale at the time the
stream was constructed. To be a bit more precise, it looks up the
num_put<unsigned char, ...> facet, i.e. one that depends on the character
type. This lookup is _not_ a template mechanism but a runtime mechanism, so
the fact that the facet itself is a template doesn't change anything.
> I would expect this to output "5" to stdout. Visual Studio does exactly
> that, as does Apple GCC 4.0. However, Linux GCC (tested with several
> versions) gives an empty line as output.
> When debugging, I found that the line "theStream << i;" caused a bad_cast
> exception inside the ostream code, which in turn caused badbit to be
> set, and the stream contents remained empty.
The bad_cast is a sign that the lookup of the num_put facet failed.
> From experience, I have learned that Linux gcc is usually closer to the
> spec than Visual Studio or Apple gcc, but in this case I can't see why a
> simple streaming operation should cause a bad_cast. What do you guys
> think?
All three implementations are standard-conformant as far as that aspect is
concerned. VS and Apple seem to ship an extension though.
Uli
--
Sator Laser GmbH
Geschäftsführer: Thorsten Fücking, Amtsgericht Hamburg HR B62 932
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]