Re: Speeding up URLConnection
mark13.pl@gmail.com wrote:
Hello,
I want to save as a string html file. My code looks like that:
URL url = new URL(fileName);
URLConnection conn = url.openConnection();
conn.setRequestProperty("Cookie", myCookieCode);
conn.connect();
BufferedReader dis = new BufferedReader(new
InputStreamReader(conn.getInputStream()));
String inputLine;
while ((inputLine = dis.readLine()) != null) {
    htmlCode += inputLine;
}
It works perfectly (in htmlCode I have the whole page, as I wanted), but
it has a very big disadvantage: it is VERY slow. In both my browsers
(Firefox, IE 6.0, cache cleared) it takes about 2 seconds to load the
page, while in Java it takes about 14 seconds. Do you know where the
problem is and how I can speed it up?
While I agree with Oliver that your String concatenation code is pretty
inefficient, I wouldn't think it could cause that kind of delay
(unless the number of lines was very, very large).
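For what it's worth, the usual fix for the concatenation cost is a StringBuilder, which appends in place instead of copying the whole string on every `+=`. A sketch (the reader here is a stand-in for the one wrapping conn.getInputStream()):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ReadAll {
    // Accumulate lines in a StringBuilder; String += copies the
    // entire accumulated text on every iteration, which is quadratic.
    static String readAll(BufferedReader in) throws IOException {
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            sb.append(line).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        BufferedReader r = new BufferedReader(new StringReader("a\nb\nc"));
        System.out.println(readAll(r));
    }
}
```

Note that readLine() strips line terminators, so the '\n' is re-added explicitly; the original `+=` loop silently dropped them.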
One probable cause (a guess, really) is that the HttpURLConnection
class tries to keep the connection open in anticipation of further
traffic to the same URL, so the reader returning null (indicating
the end of output) is delayed until the connection is eventually
dropped. Browsers, on the other hand, also keep the connection open,
but they look at the Content-Length header to see the size of the
expected response and stop reading once the expected number of bytes
has been received.
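If that guess is right, one workaround is to do the same thing by hand: check conn.getContentLength() and stop after exactly that many bytes instead of waiting for end-of-stream. A sketch of the bounded read (a ByteArrayInputStream stands in for the connection's stream here):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class BoundedRead {
    // Read exactly n bytes and return immediately, rather than
    // blocking until the server finally closes the connection.
    static byte[] readExactly(InputStream in, int n) throws IOException {
        byte[] buf = new byte[n];
        int off = 0;
        while (off < n) {
            int got = in.read(buf, off, n - off);
            if (got < 0) throw new IOException("stream ended early");
            off += got;
        }
        return buf;
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("hello world".getBytes());
        System.out.println(new String(readExactly(in, 5)));
    }
}
```

With a real connection you would call conn.getContentLength() first and fall back to reading to end-of-stream when it returns -1 (the header is optional).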
I suggest that you try another client (say, Jakarta's HttpClient) and
see if it helps. Sun's implementation has had quite a few issues like
this in the past; it was never really meant to be a full-fledged HTTP
client, but rather a simplistic mechanism for loading jar files from a
remote source.
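For reference, the equivalent fetch with the Jakarta Commons HttpClient 3.x API looks roughly like this (a sketch; the commons-httpclient jar must be on the classpath, and the URL and cookie are whatever you were passing before):

```java
import java.io.IOException;
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;

public class FetchPage {
    // Fetch a page with Commons HttpClient, which honors
    // Content-Length and returns as soon as the body has arrived.
    static String fetch(String url, String cookie) throws IOException {
        HttpClient client = new HttpClient();
        GetMethod get = new GetMethod(url);
        get.setRequestHeader("Cookie", cookie);
        try {
            client.executeMethod(get);
            return get.getResponseBodyAsString();
        } finally {
            get.releaseConnection(); // hand the connection back for reuse
        }
    }
}
```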
BK