Re: Creating large arrays..
* hamishd:
On Jan 28, 11:58 am, "Alf P. Steinbach" <al...@start.no> wrote:
* hamishd:
What is the best way to store large arrays of numbers (e.g., 4-byte
integers)?
Say I want to store an array of 1 billion 4-byte integers. If my
computer has > 4 GB of memory, then is this possible?
Depends on the computer and C++ implementation.
int Large_Array[1000000000];
This will cause a "stack overflow" when I try to execute it. How can I
do this?
You might try to allocate it dynamically, by using
std::vector<int> Large_Array( 1000000000 );
However, if you're skirting the limits of your computer's memory
capacity, then it's not a good idea even if it seems to work.
Thanks.
You'd be better off using some disk-based structure, processing only
parts of it at a time.
What are the quicker ways of doing this?
Depends on what you're doing.
E.g. it might be that what you're doing best fits some kind of tree
structure (note: a B-tree isn't a binary tree :-)).
Or it might be that what you're really doing is just some kind of
sequential processing, in which case read in suitably large chunks of a
file at a time, process them, and write them back or to some other file.
If it weren't for the size problem, a memory-mapped file would be ideal,
and IIRC the Boost library provides some fairly portable support for that.
However, the size problem means resorting to old-fashioned reads and
writes, and the clue there for efficiency is to do large enough chunks,
and perhaps also preferentially going down to at least the C file i/o
level instead of C++ iostreams (but it might be a good idea to measure!).
Cheers, & hth.,
- Alf
--
A: Because it messes up the order in which people normally read text.
Q: Why is it such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?