Simon wrote:
> Patricia Shanahan wrote:
>> Simon wrote:
>>> As a matter of curiosity, did you use a boolean[] or BitSet?
>> I started with a boolean[], and mine was slower than yours, about 200
>> milliseconds. It dropped to under 100 milliseconds when I switched to
>> BitSet. I believe the improvement is due to more efficient caching.
> Well, that is strange. I have been using a boolean[], and using BitSet
> actually slows it down on my machine. I am using a laptop, so CPU speed
> may vary. Today it is even faster than yesterday. The output was (code
> see below):
The most likely cause of the speedup I saw is more efficient caching,
because BitSet stores the data more densely. That is a very
hardware-dependent effect, though less a matter of CPU speed than of
cache configuration.
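
For comparison, a BitSet-based sieve might look like the following
sketch; this is illustrative code with assumed names, not Patricia's
actual implementation. A BitSet stores one bit per number, where a
boolean[] typically takes a byte per entry, so far more of the sieve
fits in cache:

import java.util.BitSet;

public class BitSetSieve {
    public static void main(String[] args) {
        int limit = 1000000; // bound taken from the discussion below
        BitSet composite = new BitSet(limit); // bit i set => i is composite
        long sum = 0;
        // nextClearBit skips to the next unmarked number, i.e. the next prime
        for (int prime = 2; prime < limit;
                prime = composite.nextClearBit(prime + 1)) {
            sum += prime;
            // mark all larger multiples of this prime as composite
            for (int i = 2 * prime; i < limit; i += prime) {
                composite.set(i);
            }
        }
        System.out.println("sum of primes below " + limit + " = " + sum);
    }
}

Simon's boolean[] version was: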
> // numbers[i] == true marks i as composite; sum accumulates the primes found
> public void eraseComposite() {
>     int prime = 2;
>     while (prime < numbers.length) {
>         sum += prime;
>         // mark multiples of prime
>         for (int i = 2 * prime; i < numbers.length; i += prime) {
>             // (we could start at prime * prime here, but I didn't bother,
>             // since prime * prime can overflow past Integer.MAX_VALUE)
>             numbers[i] = true;
>         }
>         // move forward to the next unmarked number, i.e. the next prime
>         prime++;
>         while (prime < numbers.length && numbers[prime]) {
>             prime++;
>         }
>     }
> }
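
Timings like the figures quoted above can be taken with a small
harness. Here is one possible self-contained version; the class name
PrimeSieve and the fields numbers and sum are assumptions, since the
surrounding class was not posted:

public class PrimeSieve {
    // numbers[i] == true means i has been marked composite
    private final boolean[] numbers;
    private long sum;

    public PrimeSieve(int limit) {
        numbers = new boolean[limit];
    }

    public void eraseComposite() {
        int prime = 2;
        while (prime < numbers.length) {
            sum += prime;
            for (int i = 2 * prime; i < numbers.length; i += prime) {
                numbers[i] = true;
            }
            prime++;
            while (prime < numbers.length && numbers[prime]) {
                prime++;
            }
        }
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        PrimeSieve sieve = new PrimeSieve(1000000);
        sieve.eraseComposite();
        long elapsedMs = (System.nanoTime() - start) / 1000000;
        System.out.println("sum = " + sieve.sum + " in " + elapsedMs + " ms");
    }
}

A single cold run like this is only a rough measure; JIT warm-up alone
can account for run-to-run differences of the size mentioned above.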
There is one optimization you missed. When prime is greater than the
square root of numbers.length, there is no need to do the composite scan.
If a number less than 1,000,000 is composite, at least one prime factor
is less than 1000.
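
Concretely, the marking loop only needs to run while prime * prime is
below the limit; everything still unmarked after that point is prime
and just needs to be summed. Here is a sketch of that variant, written
as a drop-in replacement for eraseComposite in the harness above, using
long arithmetic to sidestep the int overflow the code comment worries
about:

public void eraseComposite() {
    int prime = 2;
    // marking is only needed while prime * prime is below the limit;
    // the long cast keeps the product from overflowing an int
    while ((long) prime * prime < numbers.length) {
        sum += prime;
        // start at prime * prime: any smaller multiple has a smaller
        // prime factor and was already marked in an earlier pass
        for (int i = prime * prime; i < numbers.length; i += prime) {
            numbers[i] = true;
        }
        // advance to the next unmarked number, i.e. the next prime
        prime++;
        while (prime < numbers.length && numbers[prime]) {
            prime++;
        }
    }
    // everything still unmarked from here on is prime
    for (int i = prime; i < numbers.length; i++) {
        if (!numbers[i]) {
            sum += i;
        }
    }
}

For a limit of 1,000,000 the marking passes stop after 997, the largest
prime below 1000, exactly as described above.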
Simon wrote:
This is what I tried to express in the above comment. It was more laziness than