unable to import large data txt file due to memory
Hi,
I have been struggling to import the data I need to process in MATLAB. It contains 3 fundamental cycles of 20 ms each, and I just need the last 20 ms. However, MATLAB will still not allow me to import that data section. I have tried to increase the Java heap size, but it is still not enough. Is there a way I can process my data without losing too many valuable data points?
The file is 18,142,928 KB, and each variable in the set has 101,799,346 values. The data goes from 0 to 60 ms, so I guess I only need 2/3 of the dataset, and I only need 3 variables. Again, I have tried to take just one variable at a time, but 2/3 of the variable is still too large.
Answers (1)
Venkat Siddarth Reddy
on 6 May 2024
Edited: Venkat Siddarth Reddy
on 6 May 2024
Hi,
You can try using "datastore", which is designed for data that is too large to fit into memory. It lets you read the data in smaller portions that do fit in memory, i.e., it imports the data incrementally so you can work through it sequentially and keep only what you need.
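As a minimal sketch of that approach, assuming your file is a delimited text file with a header row and a time column in seconds; the file name "bigdata.txt" and the variable names "Time", "Va", "Vb", "Vc" are placeholders for your actual names, and the 0.040 threshold would need adjusting if your time column is in milliseconds:

ds = tabularTextDatastore("bigdata.txt");            % let MATLAB detect delimiter and headers
ds.SelectedVariableNames = ["Time","Va","Vb","Vc"];  % keep only the time column and the 3 signals
ds.ReadSize = 1e6;                                   % rows per chunk; tune to your available memory

lastCycle = [];                                      % accumulates rows belonging to the last 20 ms
while hasdata(ds)
    chunk = read(ds);                                % read one block that fits in memory
    keep  = chunk.Time >= 0.040;                     % 40-60 ms window; adjust units as needed
    lastCycle = [lastCycle; chunk(keep,:)];          %#ok<AGROW> append the matching rows
end

If the filtered rows are still too large to hold in memory at once, you can instead write each filtered chunk to a new file with writetable, or create a tall array from the datastore and filter it lazily.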
To learn more about datastore, refer to the following documentation:
I hope it helps!
0 comments