Re: Type length in bits?
On Apr 29, 1:58 am, "Matthias Hofmann" <hofm...@anvil-soft.com> wrote:
"Alf P. Steinbach /Usenet" <alf.p.steinbach+use...@gmail.com> schrieb im
Newsbeitragnews:ip0bud$639$1@dont-email.me...
* Angel Tsankov, on 24.04.2011 07:45:
What is a standard way to get the "length in bits" (in the sense of
5.8/1) of a type?
sizeof(T)*bitsPerByte
where 'bitsPerByte' can be defined as e.g.
#include <limits.h>
int const bitsPerByte = CHAR_BIT;
or
#include <limits>
int const bitsPerByte = std::numeric_limits<unsigned char>::digits;
The latter is both more verbose and more subtle.
Note that this bit length is not in general guaranteed to equal the
number of value representation bits, but mostly that's an academic, not a
practical, issue.
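To make the distinction concrete, here is a minimal sketch (not from the
original posts) that computes both counts for int; where they disagree,
the type has padding bits that belong to the object representation only:

#include <climits>
#include <limits>
#include <iostream>

int main()
{
    // Bits in the object representation of int.
    int const object_bits = sizeof( int ) * CHAR_BIT;

    // Bits in the value representation of int: the value digits
    // plus the sign bit for signed types.
    int const value_bits = std::numeric_limits<int>::digits
                         + ( std::numeric_limits<int>::is_signed ? 1 : 0 );

    std::cout << "object representation bits: " << object_bits << '\n'
              << "value representation bits:  " << value_bits << '\n';

    // On most platforms the two are equal; where they differ,
    // int has padding bits.
}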
What happens if for a type T, the number of object representation bits
exceeds the number of value representation bits and I do this:
// Determine the number of bits used for the value representation.
const int num_bits = std::numeric_limits<T>::digits;
// The bit pattern of x will probably be all
// zeros except for the least significant bit.
T x = 1;
// Shift the least significant bit out of range of
// the bits used for the value representation.
x <<= num_bits;
Undefined behavior. However...
Let us assume that T is an 18-bit integer type that uses 16 bits for the
value representation like this ( O = object representation only, V = value
representation):
OOVVVVVVVVVVVVVVVV
The first line will set the value of num_bits to 16. The second line will
initialize x to contain a value like ( U = undefined ):
UU0000000000000001
In the actual case I know of (Unisys MCP), UU will be guaranteed 0.
The third line will shift the least significant bit 16 positions to the
left, so it ends up at the position of the least significant bit used for
the object representation only:
U10000000000000000
According to what you've just said, the top two bits aren't
visible, so it doesn't matter. (In this particular case, it
doesn't matter anyway, since there was undefined behavior. But
if you'd started with UU1000000000000000, shifting by just 1 would
create the same situation.) In practice: either the bits must
have some specific value (maybe parity), and the hardware will
ensure that they do, or they are indifferent, in which case,
they are indifferent.
Now what is the value of x according to the standard? Am I
right in thinking that it evaluates to zero, because any bits
that are shifted out of range are simply discarded? Or is it
possible that the bit pattern now is a trap representation on
certain systems, resulting in undefined behaviour?
You cannot get a trap representation by shifting a legal value a
legal number of places.
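For completeness, one defensive way to write the shift (my own sketch,
not from the thread) is a hypothetical helper, safe_shift_left(), that
checks the count against std::numeric_limits<T>::digits and returns zero
once every value bit would be discarded, so it never relies on what
happens to any padding bits. It assumes T is an unsigned integer type:

#include <limits>

// Hypothetical helper, assuming T is an unsigned integer type:
// shift left, but return zero instead of invoking undefined
// behaviour when the count reaches the value representation width.
template <typename T>
T safe_shift_left( T value, int count )
{
    int const value_bits = std::numeric_limits<T>::digits;
    if ( count >= value_bits )
        return T( 0 );   // every value bit would be shifted out
    return T( value << count );
}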
--
James Kanze
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]