Re: A Better Choice?
On Wednesday, 2 October 2013 23:24:34 UTC+2, Victor Bazarov wrote:
On 10/2/2013 2:48 PM, Gert-Jan de Vos wrote:
On Sunday, 29 September 2013 00:15:35 UTC+2, Mike Copeland wrote:
I have the following data declaration that is causing compiler
warnings:
char DBI[60001] = {'\0'};
The purpose of this data is to store a character value ('a'..'I') for
each corresponding record that has been read from a database. The
records are identified by a value that ranges from 1-60000. I can't use
a bitset here, as I need to store the character value associated with
each record. I don't want to be limited by the current supported range
(1...60000).
For mapping a number to a character with any number of entries, this would
be my first choice:
std::map<int, char> dbi;
I wonder what the difference with std::vector<char> would be if *all*
sixty thousand and one numbers in the range need a character. I mean,
the total size of a vector is 60001 + sizeof(std::vector<char>) plus the
overhead of allocating the block, which is ~16 bytes. For a map it
would be 60000 * (sizeof(__node_type) + overhead), yes? Whatever
__node_type is, that is. And given that it usually needs to keep at
least three pointers, the value, and [as in VC++] a couple of chars,
that's something like 27 bytes per node... So, is that, like, 40 times
the memory? I don't think it's worth considering unless the array is
*really* sparse.
Of course Victor, a map of 60000 entries would take much more memory
than an array of the same size. It was this part of Mike's post that pointed
me to a map:
"I don't want to be limited by the current supported range (1...60000)"
I understand he has a database where each entry has an id in the range
1..60000, but this range may grow. There is no information about the
number of entries in the database. If the number of entries is more
than a few percent of the range, an array or vector would indeed be
better, but then you need guarantees about the range of ids you have
to handle.
G-J