RAM overflow and long loading times with an SQL query on big data
I have an existing database from which I need to extract a single record set totalling about 10 GB of data. I have tried to load the data with:
conn = sqlite(databaseFile, 'readonly');
GetResult = ['SELECT result1, result2 ...... FROM Result ' ...
             'WHERE ResultID IN ......'];
Data = fetch(conn, GetResult);
With this query, memory usage climbs until all 16 GB of RAM are consumed, and then MATLAB crashes.
I also tried appending 'LIMIT 10000' to the end of the query and paging through the results by offset. This works, but extrapolating from 20 individual batches, fetching all the results would take about 3 hours.
Answers (1)
Dinesh
on 22 Dec 2023
Hi Johannes,
To manage a dataset this large, it is best to import the data into MATLAB in batches to avoid out-of-memory errors and slow processing. Since you are working with an SQLite database, MATLAB may simply be unable to hold the entire 10 GB result set in memory at once. To handle this efficiently, use the "fetch" function with the "MaxRows" name-value argument to cap the number of rows each query returns per batch. This lets you control how much data is in memory at any one time and can speed up the overall import.
The following link will help you:
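A minimal sketch of the batched approach described above, assuming a MATLAB version whose sqlite-connection "fetch" accepts the "MaxRows" argument; the column and table names are placeholders taken from the question, and batchSize is an assumed tuning value:

```matlab
% Batched import sketch (placeholder names; adjust to your schema).
conn = sqlite(databaseFile, 'readonly');
batchSize = 10000;   % rows per batch; tune to available RAM
offset = 0;
batches = {};
while true
    % OFFSET pages through the result set; 'MaxRows' additionally caps
    % what a single fetch call can return.
    q = sprintf(['SELECT result1, result2 FROM Result ' ...
                 'LIMIT %d OFFSET %d'], batchSize, offset);
    batch = fetch(conn, q, 'MaxRows', batchSize);
    if isempty(batch)
        break
    end
    batches{end+1} = batch; %#ok<AGROW>
    offset = offset + size(batch, 1);
end
Data = vertcat(batches{:});
close(conn)
```

Keeping batchSize modest bounds peak memory, and collecting the batches in a cell array before one vertcat avoids repeatedly reallocating a growing table.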