Re: Memory & Pixel Usage Etc: As Number of JTable Rows & Columns Increases
Eric Sosman wrote:
Haircuts Are Important wrote:
Disregarding garbage collection, what do you think about "ever" using
freeMemory() as an approximation for memory usage:
Runtime.getRuntime().gc();
Ha.
long before = Runtime.getRuntime().freeMemory();
Long before what?
long after = Runtime.getRuntime().freeMemory();
Long after what?
"Disregarding garbage collection," freeMemory() is a reliable
way to measure a meaningless quantity. Disregarding gravity, how
high can you jump?
It's possible to use totalMemory() and freeMemory() and a lot
of care (calling gc() "suggests" garbage collection and promises
only a "best effort") to estimate the sizes of simple objects like
Integers, fixed-length Strings or arrays, and so on. But with more
complex objects containing references to other complex objects it
becomes much more difficult. You seem to be concerned about how
much memory a JTable uses -- well, does that include the TableModel?
The data in the TableModel? The TableColumnModel, the internal
HashTables that store editors and renderers, the editors and
renderers themselves, ...? You've got a fairly thorny definitional
problem before you can even get started.
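To make the simple-object case concrete, that kind of measurement can be
sketched roughly as below. This is a minimal sketch, not code from this
thread; the instance count and the object measured are arbitrary, and gc()
remains only a hint, so the result is an estimate at best.

// Sketch: estimate the per-instance heap cost of a simple object.
public class SimpleSizeEstimate {
    public static void main(String[] args) {
        final int COUNT = 100000;             // arbitrary instance count
        Object[] keep = new Object[COUNT];    // hold references so nothing is collected

        Runtime rt = Runtime.getRuntime();
        rt.gc();                              // only a suggestion to the JVM
        long before = rt.totalMemory() - rt.freeMemory();   // heap in use before

        for (int i = 0; i < COUNT; i++) {
            keep[i] = new long[2];            // a "simple" fixed-length array
        }

        rt.gc();
        long after = rt.totalMemory() - rt.freeMemory();    // heap in use after

        System.out.printf("~%.1f bytes per instance%n",
                (after - before) / (double) COUNT);
        System.out.println("(kept " + keep.length + " references reachable)");
    }
}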
Ah, the heck with it. Here's the output of a micro-benchmark
I wrote some time ago, estimating the size of a `new JTable()' by
fitting least-squares lines relating "instance count" to "heap used:"
500 instances: 5247.664 bytes each
1000 instances: 5248.128 bytes each
1500 instances: 5220.9584 bytes each
2000 instances: 5220.9328 bytes each
2500 instances: 5224.7922285714285 bytes each
3000 instances: 5228.693142857142 bytes each
3500 instances: 5231.899428571429 bytes each
4000 instances: 5234.469066666667 bytes each
4500 instances: 5236.533624242425 bytes each
5000 instances: 5238.165672727273 bytes each
BUILD SUCCESSFUL (total time: 14 seconds)
There you have it -- but *what* do you have?
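(The harness itself isn't shown, but a benchmark of roughly that shape can be
sketched as follows: allocate JTables in batches, record instance count
against heap in use after each batch, and take the least-squares slope as the
bytes-per-instance estimate. The batch size, step count, and structure here
are guesses, not the actual code.)

import javax.swing.JTable;

// Sketch of an instance-count vs. heap-used benchmark; not the original code.
public class JTableSizeBench {

    // Least-squares slope of y against x over the first n points.
    static double slope(double[] x, double[] y, int n) {
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i]; sxx += x[i] * x[i]; sxy += x[i] * y[i];
        }
        return (n * sxy - sx * sy) / (n * sxx - sx * sx);
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        final int STEPS = 10, BATCH = 500;         // guesses, matching the 500..5000 output
        double[] counts = new double[STEPS + 1];   // counts[0] = 0 is the baseline point
        double[] used = new double[STEPS + 1];
        JTable[] keep = new JTable[STEPS * BATCH]; // hold references so nothing is collected
        int made = 0;

        rt.gc();                                   // only a suggestion to the JVM
        used[0] = rt.totalMemory() - rt.freeMemory();

        for (int s = 1; s <= STEPS; s++) {
            for (int i = 0; i < BATCH; i++) {
                keep[made++] = new JTable();
            }
            rt.gc();
            counts[s] = made;
            used[s] = rt.totalMemory() - rt.freeMemory();
            System.out.println(made + " instances: "
                    + slope(counts, used, s + 1) + " bytes each");
        }
        System.out.println("(kept " + keep.length + " JTables reachable)");
    }
}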
A meaningless benchmark of a meaningless quantity that lacks predictive value.
As you point out. It's further complicated by the fact that the same structure in a program
can occupy not just somewhat different amounts of memory, as you show, but
wildly different amounts. Optimization can remove an object from the heap altogether
under certain circumstances, and those circumstances can change during program execution,
so the very same structure occupies heap at some points in a run and none at others.
That is a reduction of 100%, or even more, depending on which of the varying "in-heap"
memory numbers you choose to call 100%.
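For instance (my sketch, not anything from the benchmark above): HotSpot's escape analysis
can scalar-replace an object that never escapes the method creating it, so the identical
class can cost zero heap at one call site and the full per-instance cost at another.
Whether that actually happens depends on the JVM, its version, and flags such as
-XX:+DoEscapeAnalysis.

import java.util.ArrayList;
import java.util.List;

// Illustrative only: whether the non-escaping allocation really stays off the
// heap depends on the JIT; the point is that the same class may or may not
// occupy heap depending on how it is used.
public class EscapeDemo {
    static final class Pair {
        final int a, b;
        Pair(int a, int b) { this.a = a; this.b = b; }
        int sum() { return a + b; }
    }

    static final List<Pair> kept = new ArrayList<Pair>();

    // The Pair never escapes: after JIT compilation it is a candidate for
    // scalar replacement, i.e. no heap allocation at all.
    static int noEscape(int x, int y) {
        Pair p = new Pair(x, y);
        return p.sum();
    }

    // The same class, stored where it escapes: it must live on the heap.
    static int escapes(int x, int y) {
        Pair p = new Pair(x, y);
        kept.add(p);
        return p.sum();
    }

    public static void main(String[] args) {
        long total = 0;
        for (int i = 0; i < 1000000; i++) {
            total += noEscape(i, i + 1);   // hot loop, candidate for elimination
        }
        total += escapes(1, 2);            // this one certainly hits the heap
        System.out.println(total + " (kept " + kept.size() + " escaping instance)");
    }
}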
And "Haircuts", for God's sake stop deleting your posts. What in the heck is wrong with you?
--
Lew