Error using gunzip function

Hi,
I have been using gunzip to process large quantities of data as shown below. I have about 100,000 files per year and 14 years to do.
This does exactly what I need - decompresses the .gz file and returns it to the original folder using exactly the same name but in .dat format.
files=gunzip('*.gz');
I have successfully (and very slowly) decompressed the first 7 years' worth. However, I now keep getting the following error:
Error using gunzip>gunzipwrite (line 234)
Unexpected end of input stream when attempting to GUNZIP the file "metoffice-c-band-rain-radar_uk_201103071320_1km-composite.dat".
Error in gunzip>gunzipEntries (line 154)
names{k} = gunzipwrite(entries(k).file, outputDir, baseName, streamCopier);
Error in gunzip (line 90)
names = gunzipEntries(entries, outputDir);
Error in open2011 (line 1)
files=gunzip('*.gz');
Is there anything I can do to fix this error? When I restart the code, it re-decompresses all the files that were already done and eventually stops at the same error.
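That "Unexpected end of input stream" message usually means the .gz file itself is truncated or corrupt. A minimal sketch (folder layout assumed to match yours, with everything in the current folder) that processes one file at a time, skips files whose .dat already exists, and catches the error so a single bad .gz does not abort the whole run:

```matlab
% Sketch only: decompress one file at a time, skip files already done,
% and catch errors so one truncated .gz does not stop the whole batch.
gzFiles = dir('*.gz');
bad = {};                                  % collect names of failing files
for k = 1:numel(gzFiles)
    gzName  = gzFiles(k).name;
    datName = gzName(1:end-3);             % strip the trailing '.gz'
    if exist(datName, 'file')              % already decompressed on a previous run
        continue
    end
    try
        gunzip(gzName);
    catch err
        bad{end+1} = gzName;               %#ok<AGROW>
        warning('Skipping %s: %s', gzName, err.message);
    end
end
```

Any names collected in `bad` are likely truncated downloads; re-downloading just those files may resolve the error.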
Any suggestions for speeding up the decompression would also be welcome. It may be worth noting that I am on a Mac.
Thanks in advance.

6 Comments

Have you tried saving the output to a different folder, rather than the same one?
I haven't; could that be the source of the error? It worked fine until I reached the 8th year to decompress. I thought perhaps it was the data itself, but I have tried other years and now keep hitting the same issue...
Disk full?
My computer has over 250GB available and I am working from a 2TB hard drive with 1.5TB of space available. After each year has been decompressed I extract the data I need from it and delete the files. A year's worth of decompressed data will take up about 750GB of storage but usually gets deleted straight away.
Is there perhaps an alternative method that is relatively efficient that achieves the same function as gunzip?
I recommend using an external unzip utility instead of gunzip(). gunzip() uses Java methods, so it is limited by Java's efficiency and runs the risk of exceeding the allocated Java heap memory.
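For example, a sketch that shells out to the command-line gunzip that ships with macOS (an assumption; the -k flag, which keeps the original .gz, may require a reasonably recent version):

```matlab
% Sketch: call the system gunzip from MATLAB instead of the Java-based
% gunzip(). A nonzero status indicates a corrupt or truncated file.
gzFiles = dir('*.gz');
for k = 1:numel(gzFiles)
    cmd = sprintf('gunzip -kf "%s"', gzFiles(k).name);
    [status, msg] = system(cmd);
    if status ~= 0
        warning('gunzip failed on %s: %s', gzFiles(k).name, msg);
    end
end
```

This avoids the Java memory limits entirely, and the per-file loop means one bad file only produces a warning rather than stopping the batch.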
Thanks for your suggestions. The main issue I have encountered is that, given the large number of files, the programs I have tried so far eventually crash.
The built-in macOS Archive Utility actually works pretty well: all you have to do is set it as the default program for opening .gz files, and simply opening a .gz file creates a .dat file next to the original. However, you can only select about 500 files at a time (otherwise it crashes), so this approach is infeasible.
Is there a way to work around this by using MATLAB to drive the utility software for decompression?


Answers (0)

Release: R2018a

Asked: ASJ on 13 May 2018

Commented: ASJ on 14 May 2018
