MATLAB Answers

urlread-function is very slow since R2012b - why?

Asked by Sven Meister on 11 Nov 2012
Latest activity: commented on by James on 13 Jan 2014
Accepted Answer by Jan
Hi,
I'm using a small piece of code to read information from about 100 websites using urlread in its simplest form:

    for count = 1:elements
        html_str = urlread(['http://www.website.com/page' num2str(count)]);
    end

Yesterday I upgraded my MATLAB from R2011a to R2012b and was slightly shocked. With the old version the loop took about 20-30 seconds. The new one needs nearly 5 minutes!
My research suggests that MathWorks extended the urlread function with 3 additional parameters and removed some old ones: http://www.mathworks.de/de/help/matlab/release-notes.html But I do not use any of these parameters, only the bare function. I tried manually reducing the 'Timeout' to 0.1, but it had no effect.
Does anyone have an idea what causes the slowdown and, above all, how I can get my old performance back? Thanks a lot for your assistance!
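To make the comparison concrete, the slowdown can be measured like this (a minimal sketch; the URL is a placeholder, and the 'Timeout' name-value pair is the one introduced in R2012b):

```matlab
% Time a batch of urlread calls (placeholder URL)
elements = 10;
tic
for count = 1:elements
    html_str = urlread(['http://www.website.com/page' num2str(count)]);
end
toc

% The R2012b syntax also accepts a read timeout (in seconds):
html_str = urlread('http://www.website.com/page1', 'Timeout', 5);
```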

3 Answers

Answer by Jan on 11 Nov 2012 (Accepted Answer)

Why do you assume that the problem is caused by MATLAB? Do you still have the old MATLAB installed? If so, please check whether the old version is also slower today.

  5 Comments

Jan on 12 Nov 2012
Please avoid cross-posting. It wastes the time of those who volunteer to help. Thanks!
Aha - I found the cross-posting Jan mentioned, on gomatlab.de :-)
Looks like you found a solution - can you share what helped? And I found Win8 IS supported. It seems to have been qualified again: http://www.mathworks.com/support/sysreq/current_release/



Answer by Clemens on 28 Nov 2013

  1 Comment

Solved my problem too - in my tests, urlread2 is around 10 times faster than urlread.
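For anyone trying this: urlread2 is a File Exchange submission, not a built-in, and can serve as a near drop-in replacement. A minimal sketch, assuming urlread2 is on the MATLAB path and a placeholder URL (field names follow the submission's documentation):

```matlab
% urlread2 returns the response body plus status information
[html_str, extras] = urlread2('http://www.website.com/page1');
if extras.isGood
    disp(extras.status.value)   % HTTP status code, e.g. 200
end
```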



Answer by John Hedengren on 8 Nov 2013

I also noticed the loss in speed Sven mentioned on some of our lab computers. We were running R2010b and R2012b. The same code would take 2-3 seconds on R2010b and over 5 minutes on R2012b, so we naturally used R2010b. We use a web-service architecture in the APMonitor Optimization Toolbox, so some users switched to APM Python because of the performance decrease. I recently tried a modified version of urlread with just the basic function calls (see below). It seems to hang when it tries to get a response from the server:

    inputStream = urlConnect.getInputStream;

Any suggestions would be greatly appreciated. The code we are using is online at APMonitor.com; running test_all.m runs a number of example problems. For some users it is fast, but for others it is extremely slow.
    function output = urlread_apm(str)
    try
        if usejava('jvm')
            % Use Java directly if the JVM is available
            import com.mathworks.mlwidgets.io.InterruptibleStreamCopier;
            url = java.net.URL(str);
            urlConnect = url.openConnection;
            timeout = 1000000000;          % read timeout in milliseconds
            method = 'GET';
            urlConnect.setRequestMethod(upper(method));
            urlConnect.setReadTimeout(timeout);
            urlConnect.setDoOutput(true);
            outputStream = urlConnect.getOutputStream;
            outputStream.close;
            try
                inputStream = urlConnect.getInputStream;
                isGood = true;
            catch ME
                % On an HTTP error, read the error stream instead
                inputStream = urlConnect.getErrorStream;
                isGood = false;
                if isempty(inputStream)
                    % Keep only the first line of the Java error message
                    msg = ME.message;
                    I = strfind(msg, char([13 10 9]));
                    if ~isempty(I)
                        msg = msg(1:I(1)-1);
                    end
                    error(msg)
                end
            end
            % Copy the stream into a byte array and convert to char
            byteArrayOutputStream = java.io.ByteArrayOutputStream;
            isc = InterruptibleStreamCopier.getInterruptibleStreamCopier;
            isc.copyStream(inputStream, byteArrayOutputStream);
            inputStream.close;
            byteArrayOutputStream.close;
            output = char(typecast(byteArrayOutputStream.toByteArray', 'uint8'));
            % output = typecast(byteArrayOutputStream.toByteArray', 'uint8');
        else
            % urlread is sometimes slower after R2012a;
            % use it only if Java is not available
            output = urlread(str);   % was urlread(url): url is undefined here
        end
    catch Err
        output = urlread(str);       % fall back to urlread with the input string
    end
    end
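The function above is called the same way as urlread itself; a minimal usage sketch with a placeholder URL:

```matlab
% Fetch a page with the Java-based reader defined above
html = urlread_apm('http://www.website.com/page1');
```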
