On 10/2/2013 2:48 PM, Gert-Jan de Vos wrote:
On Sunday, 29 September 2013 00:15:35 UTC+2, Mike Copeland wrote:
I have the following data declaration that is causing compiler
warnings:
char DBI[60001] = {'\0'};
The purpose of this data is to store a character value ('a'..'I') for
each corresponding record that has been read from a database. The
records are identified by a value that ranges from 1 to 60000. I can't
use a bitset here, as I need to store the character value associated
with each record. I don't want to be limited by the currently
supported range (1..60000).
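(For reference, a minimal sketch of the direct-array scheme described
above; the usage lines are illustrative, not from the original post:)

// Index = record number (1..60000); value = status character, with
// '\0' meaning "record not seen yet". At file scope the array is
// zero-initialized even without the explicit initializer.
char DBI[60001];

int main()
{
    DBI[15] = 'a';        // record 15 was read, tagged 'a'
    if (DBI[42] == '\0')  // record 42 not encountered yet
        DBI[42] = 'b';
}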
For mapping a number to a character with any number of entries, this would
be my first choice:
std::map<int, char> dbi;
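A minimal sketch of that approach (the record numbers and tags are
made up for illustration):

#include <iostream>
#include <map>

int main()
{
    std::map<int, char> dbi;    // record number -> status character

    dbi[15] = 'a';              // store a tag for record 15
    dbi[70000] = 'c';           // keys beyond 60000 need no redeclaration

    // Look up without inserting a default entry as operator[] would.
    std::map<int, char>::const_iterator it = dbi.find(15);
    if (it != dbi.end())
        std::cout << "record 15: " << it->second << '\n';
}

Unlike the fixed array, the map only pays for records actually
present, and the key range needs no upper bound declared in advance.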
I wonder what the difference from std::vector<char> would be if *all*
sixty thousand and one numbers in the range need a character. I mean,
the total size of the vector is 60001 + sizeof(std::vector<char>) plus
the overhead of allocating the block, which is ~16 bytes. For a map it
would be 60000 * (sizeof(__node_type) + overhead), yes? Whatever
__node_type is, that is. And given that a node usually needs to keep
at least three pointers, the value (the key/char pair), and [as in
VC++] a couple of flag chars, that's about 27 bytes, or around 43 with
the per-node allocation overhead... So, roughly forty times the
memory? I don't think the map is worth considering unless the array is
*really* sparse.
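A back-of-the-envelope check of that figure; the 27-byte node and the
~16 bytes of heap bookkeeping are the assumptions above, not
measurements:

#include <cstddef>
#include <iostream>

int main()
{
    // Assumed figures from the estimate above (not measured):
    const std::size_t node_size      = 27;     // 3 pointers + key/char pair + flags
    const std::size_t alloc_overhead = 16;     // per-allocation heap bookkeeping
    const std::size_t entries        = 60000;

    const std::size_t map_bytes   = entries * (node_size + alloc_overhead);
    const std::size_t array_bytes = 60001;     // one char per possible record

    std::cout << "map:   ~" << map_bytes   << " bytes\n"
              << "array: ~" << array_bytes << " bytes\n"
              << "ratio: ~" << map_bytes / array_bytes << "x\n";
}

And since a std::vector<char> can be resize()d as records with higher
numbers show up, the dense layout doesn't impose a hard upper bound
either.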
Yes, using a std::map would be a waste of memory. On the other hand: