Re: the inefficiency of noncontiguous data - why?
On Sep 29, 12:00 pm, wittle <w...@w.com> wrote:
Daniel Dyer wrote:
On Sat, 29 Sep 2007 19:44:45 +0100, wittle <w...@w.com> wrote:
I noticed that breaking up a large array into 2 pieces results in a
much higher memory usage for my program. Example:
A program with:
float[] a = new float[32000000];
uses 138MB of memory on my system.
But a program with:
float[] b = new float[16000000];
float[] c = new float[16000000];
uses 187MB of memory, even though it's the same amount of data.
Why is this? My heap size is 800MB.
How are you measuring memory usage?
Dan.
--Daniel Dyer
http://www.uncommons.org
[Top posting fixed]
Using the windows task manager, "Virtual Memory Usage" column.
Please don't top-post; it confuses the conversation...
Windows task manager does not accurately measure Java memory usage.
The JVM's allocation of memory and the Java program's allocation of
memory don't always correspond one-to-one. It could be that your test
does a garbage collection in one instance, and not in the other... It
would be interesting to compare
float[] b = new float[16000000];
with
float[] b = new float[32000000];
and see how the memory differs.
Also, see if running the program several times has different results,
or running it inside an IDE, vs running it outside an IDE.
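A more direct way than Task Manager is to ask the JVM itself via
Runtime.totalMemory()/freeMemory(). Here's a rough sketch of that kind of
comparison (array sizes scaled down here so it runs quickly anywhere, and
note that System.gc() is only a hint, so the numbers are approximate):

```java
public class MemCompare {

    // Approximate heap in use right now. System.gc() is only a hint
    // to the JVM, so treat the result as a rough figure.
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        System.gc();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeap();

        // Two halves instead of one big array (scaled-down sizes).
        float[] b = new float[1_000_000];
        float[] c = new float[1_000_000];

        long after = usedHeap();

        // Each float is 4 bytes, so the arrays alone need ~8 MB here.
        long expected = 4L * (b.length + c.length);
        System.out.println("measured delta (bytes): " + (after - before));
        System.out.println("array data (bytes):     " + expected);
    }
}
```

Comparing the measured delta against the raw array size shows how much
of the difference is JVM overhead (heap growth policy, GC headroom)
rather than your data.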