Memory issues when using xlsread and dataset('XLSFile', ...)
I have a huge file in XLSX format. While xlsread runs without memory issues, the command A = dataset('XLSFile', filename) throws the error "Error: Not enough storage is available to complete". Why would this happen? Any broad guidelines to keep in mind while handling large data would also be very helpful.
Thank You
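For reference, a minimal sketch of the two calls being compared (the file name below is just a placeholder for the actual file):
filename = 'bigdata.xlsx';            % placeholder file name
[num, txt, raw] = xlsread(filename);  % plain arrays; this runs without error
A = dataset('XLSFile', filename);     % dataset array; this is the call that fails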
1 comment
Peter Perkins
on 8 Mar 2011
Vandhana, it would help to know more specifically what your data are, and to see the error trace that gets spit out to the command window, if there is one.
Answers (3)
Walter Roberson
on 6 Mar 2011
I speculate that dataset() reads the data and then converts it into a different organization, requiring intermediate storage equivalent to the size of the data.
I do not know this to be true, as I do not have that toolbox. But from a code point of view it is more economical to write the xlsx-reading code once and convert its output, rather than writing separate xlsx-reading code for every routine that converts the data, just to avoid the potential for intermediate storage being exceeded.
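Along those lines, one possible workaround (a sketch only, since I cannot test it without the toolbox; it assumes the spreadsheet is numeric with, say, two columns, and the file and variable names are placeholders) is to read the file once with xlsread and build the dataset from the in-memory matrix instead of letting dataset() reopen the file:
num = xlsread('bigdata.xlsx');          % read once into a plain numeric matrix
ds  = dataset({num, 'Col1', 'Col2'});   % split the matrix columns into named variables
clear num                               % free the intermediate copy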
0 comments
Matt Tearle
on 7 Mar 2011
Do you have a simple way to write the file out in a different format (e.g. text, such as CSV)? What are the columns of your file (i.e. how many and of what type)?
If you can work with text, you might be able to batch-process the data and/or use non-double types to reduce the memory requirements.
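Here is a rough sketch of that batch idea (assuming the data were exported to a file called data.csv with one integer ID column followed by three numeric columns; adjust the file name and format string to your actual layout):
fid = fopen('data.csv');
blockSize = 100000;                     % rows per batch
while ~feof(fid)
    C = textscan(fid, '%u32%f32%f32%f32', blockSize, 'Delimiter', ',');
    % C{1} is uint32 and C{2:4} are single, half the memory of doubles
    % ... process this batch here before reading the next one ...
end
fclose(fid);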
0 comments
Nick Haddad
on 3 Oct 2014
This issue is a known bug in MATLAB and has been addressed in the following bug report:
The bug report includes a workaround that you can install for MATLAB R2013a through R2014b.
0 comments