Eric Sosman wrote:
j1mb0jay wrote:
I am currently working on a dictionary-populating program. I have a
socket connection to my local news server and am trawling through
all of the articles looking for new words. Java's String class has a
method that hashes strings. I was wondering if I should still be using
it even though I have over two million words in the hash table.
Although the hash table is currently Big O(4).
This makes no sense. O(4) = O(1) = O(0.01) = O(1000000),
by definition. What do you really mean?
I am using the Multiply-Add-and-Divide (MAD) method to compress the
hash code. Does Java have any built-in functions (methods) that will do
this for me, or does anyone know of a more efficient way?
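
For reference, this is roughly what I mean by MAD compression (a
simplified sketch only; the prime p and the random constants a and b
below are illustrative placeholders, not the values from my program):

    import java.util.Random;

    /**
     * Sketch of MAD (Multiply-Add-and-Divide) compression:
     * index = ((a * h + b) mod p) mod capacity, where p is a prime
     * larger than the capacity and a, b are random values in [0, p).
     */
    public class MadCompressor {
        private final long p;        // prime > capacity (keep it modest so a * h fits in a long)
        private final long a;        // random multiplier, 1 <= a < p
        private final long b;        // random shift, 0 <= b < p
        private final int capacity;  // size of the bucket array

        public MadCompressor(int capacity, long prime) {
            this.capacity = capacity;
            this.p = prime;
            Random rnd = new Random();
            this.a = 1 + (long) (rnd.nextDouble() * (p - 1));  // avoid a == 0
            this.b = (long) (rnd.nextDouble() * p);
        }

        /** Folds a 32-bit hashCode() value into a bucket index in [0, capacity). */
        public int compress(int hashCode) {
            long h = hashCode & 0xFFFFFFFFL;   // treat the int as unsigned
            return (int) (((a * h + b) % p) % capacity);
        }
    }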
The value delivered by hashCode -- for any class, not
just for String -- is a Java int, 32 bits wide. How (and why)
are you "compressing" this value?
My hash table is made up of an array of n LinkedLists (where n is a
prime number that is roughly double the number of words in the dictionary).
I first use the String.hashCode() method on a given word. I then
compress this number so that I can use it as an index into the array of
LinkedLists, as this 32-bit number is often far too large. I then insert
the word into the LinkedList array at the compressed-value index. (The
fact that the hash table is an array of LinkedLists is so that it handles
collisions.)
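
In other words, the structure is roughly this (a simplified sketch, not
my real class names, and the mask-and-mod in indexFor() just stands in
for the MAD compression step described above):

    import java.util.LinkedList;

    /** Sketch of a separate-chaining dictionary: an array of LinkedLists. */
    public class ChainedDictionary {
        private final LinkedList<String>[] buckets;

        @SuppressWarnings("unchecked")
        public ChainedDictionary(int primeCapacity) {
            buckets = new LinkedList[primeCapacity];
            for (int i = 0; i < buckets.length; i++) {
                buckets[i] = new LinkedList<>();
            }
        }

        /** Folds String.hashCode() into a valid array index (stand-in for MAD). */
        private int indexFor(String word) {
            return (word.hashCode() & 0x7FFFFFFF) % buckets.length;
        }

        /** Inserts a word; the LinkedList at each index absorbs collisions. */
        public void insert(String word) {
            LinkedList<String> chain = buckets[indexFor(word)];
            if (!chain.contains(word)) {   // keep the dictionary duplicate-free
                chain.add(word);
            }
        }

        public boolean contains(String word) {
            return buckets[indexFor(word)].contains(word);
        }
    }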
After inserting all of the words into the dictionary, the largest
LinkedList in the array has only four elements. I thought Big O(4) was
the correct way of describing this.
Would it help if I posted my classes on here, or offered you a place to
download the program?
j1mb0jay
Consider a prefix tree data structure instead.
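
A bare-bones version for lowercase a-z words could look something like
this (just a sketch, not tuned for real news-server text):

    /** Minimal prefix tree (trie) for lowercase a-z words. */
    public class Trie {
        private static class Node {
            Node[] children = new Node[26];  // one slot per letter a-z
            boolean isWord;                  // true if a word ends at this node
        }

        private final Node root = new Node();

        public void insert(String word) {
            Node cur = root;
            for (char c : word.toCharArray()) {
                int i = c - 'a';
                if (cur.children[i] == null) {
                    cur.children[i] = new Node();
                }
                cur = cur.children[i];
            }
            cur.isWord = true;
        }

        public boolean contains(String word) {
            Node cur = root;
            for (char c : word.toCharArray()) {
                int i = c - 'a';
                if (cur.children[i] == null) {
                    return false;
                }
                cur = cur.children[i];
            }
            return cur.isWord;
        }
    }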
If it takes N*N steps, then it is O(N*N). If it takes 4*N*N steps, it is
still O(N*N).
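
To spell out the definition: f(N) is O(g(N)) if there are constants c and
N0 with f(N) <= c*g(N) for all N >= N0. Taking c = 4 shows that 4*N*N is
O(N*N); the constant factor simply disappears. By the same reasoning, a
longest chain of four elements means lookups are O(1), not "O(4)".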