Memory issues when using xlsread and dataset('XLSFile', ...)

I have a huge file in XLSX format. While xlsread runs without memory issues, the command A = dataset('XLSFile', filename) throws the error "Error: Not enough storage is available to complete". Why would this happen? Any broad guidelines to keep in mind while handling large data would be very helpful.
Thank you.
  1 Comment
Peter Perkins on 8 Mar 2011
Vandhana, it would help to know more specifically what your data are, and the error trace that gets printed to the command window, if there is one.


Answers (3)

Walter Roberson on 6 Mar 2011
I speculate that dataset() reads the data and then converts it into a different organization, requiring intermediate storage equivalent to the size of the data.
I do not know this to be true, as I do not have that toolbox. But from a code-maintenance point of view, it is more economical to write the XLSX-reading code once and convert its output, rather than to duplicate the XLSX-reading code in every routine that converts data, just to avoid the possibility of exceeding intermediate storage.
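If the intermediate copy is indeed the problem, one way around it is to do the conversion yourself: read the sheet once with xlsread, then wrap the in-memory arrays in a dataset directly instead of calling dataset('XLSFile', ...). A minimal sketch, assuming the data are numeric with a header row; the file name 'bigfile.xlsx' and the variable names are placeholders:

```matlab
% Read the sheet once; num holds the numeric block, txt any text cells.
[num, txt] = xlsread('bigfile.xlsx');

% Build the dataset from columns already in memory, so dataset() does
% not have to re-read and re-convert the file itself.
ds = dataset({num(:,1), 'Var1'}, ...
             {num(:,2), 'Var2'});

% Release the intermediate arrays once the dataset holds its own copy.
clear num txt
```

Whether this actually reduces peak memory depends on how dataset('XLSFile', ...) is implemented internally, so treat it as an experiment, not a guaranteed fix.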

Matt Tearle on 7 Mar 2011
Do you have a simple way to write the file out in a different format (e.g. text, such as CSV)? What are the columns of your file (i.e. how many, and of what type)?
If you can work with text, you might be able to batch-process the file and/or use non-double types to reduce the memory requirements.
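The batch-processing idea above can be sketched with textscan, which reads a fixed number of rows per call and lets you store numbers as single rather than double. This assumes the data were exported to a file such as 'bigfile.csv' with two numeric columns and one text column; both the file name and the format string are placeholders you would adapt to your own columns:

```matlab
% Open the exported text version of the spreadsheet.
fid = fopen('bigfile.csv', 'r');

% %f32 tells textscan to store the numeric columns as single precision,
% which uses half the memory of double.
fmt = '%f32 %f32 %s';

while ~feof(fid)
    % Read at most 10,000 rows per pass instead of the whole file.
    chunk = textscan(fid, fmt, 10000, 'Delimiter', ',');
    % ...process or accumulate each chunk here...
end
fclose(fid);
```

Processing each chunk as it arrives (summarizing, filtering, or appending to a preallocated array) keeps peak memory bounded by the chunk size rather than the file size.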

Nick Haddad on 3 Oct 2014
This issue is a known bug in MATLAB and has been addressed in the following bug report:
The bug report includes a workaround that you can install for MATLAB R2013a through R2014b.
