Changing large matrices by not completely loading them into memory

Moritz
Moritz on 18 Jun 2015
Commented: Walter Roberson on 18 Jun 2015
Hi,
I'm attempting to modify very large matrices (single precision, 50e3 x 50e3) that don't make sense to load into memory in full. What data-handling strategy would you recommend? Ideally I would load, say, a 100x100 tile, modify it, and write it back. My machine has an SSD connected via M.2, so disk access should be relatively fast (though of course nowhere near RAM speed). What do you suggest?
Thanks,
Moritz

Answers (2)

Stephen23
Stephen23 on 18 Jun 2015
Edited: Stephen23 on 18 Jun 2015
You should read TMW's own advice on working with big data.
In particular, you may find memmapfile to be of significant interest:
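A minimal sketch of the tile-wise edit the question describes, using memmapfile. This assumes the matrix is stored column-major as raw single-precision values in a file named 'bigmat.dat' (the filename and the doubling operation are just placeholders):

```matlab
n = 50e3;                        % matrix is n-by-n, single precision
m = memmapfile('bigmat.dat', ...
    'Format', {'single', [n n], 'A'}, ...  % view the file as an n-by-n array named A
    'Writable', true);

rows = 1:100; cols = 1:100;      % the 100x100 tile to modify
tile = m.Data.A(rows, cols);     % only the touched portion is paged in from disk
tile = 2 * tile;                 % ...modify the tile (placeholder operation)...
m.Data.A(rows, cols) = tile;     % write it back through the memory map
```

Note that memmapfile works on a raw binary file, not a .mat file, so the matrix would have to be written out once with fwrite (or similar) before mapping it.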
  1 comment
Walter Roberson
Walter Roberson on 18 Jun 2015
Or instead of memmapfile, save the .mat with -v7.3 and then use matFile objects to read in portions of the array.
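A sketch of the matfile approach, assuming the matrix was saved once beforehand with save('big.mat', 'A', '-v7.3') (filename, tile size, and the transpose operation are illustrative only):

```matlab
mf = matfile('big.mat', 'Writable', true);   % handle to the -v7.3 MAT-file

n   = 50e3;                      % matrix is n-by-n
blk = 100;                       % tile edge length
for i = 1:blk:n
    r = i : min(i + blk - 1, n);
    tile = mf.A(r, r);           % read one diagonal tile from disk
    mf.A(r, r) = tile.';         % e.g. transpose each diagonal tile in place
end
```

Unlike memmapfile, indexing into a matfile object reads and writes only the requested elements of the HDF5-based -v7.3 file, so no raw binary export step is needed.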



Alessandro
Alessandro on 18 Jun 2015
Did you check out the sparse command?
  1 comment
Moritz
Moritz on 18 Jun 2015
Yes, I did. However, I believe that only pays off when a large share of the elements are zero. In my case, fewer than 5% of the elements are zero.

