Reading data from large .CSV files
hal9k
15 Jan 2021
Answered: Athul Prakash
22 Jan 2021
I am trying to figure out the most efficient way to achieve what is shown in the picture. I have looked into datastores and other options, but I can't seem to find the right strategy to accomplish this.
I have large CSV file sets (each ~10 GB) that contain timestamps. I need to extract certain sections based on time ranges (see the sample on the right), combine them, and create .mat or .txt (or .csv) files.
What would be the best strategy to achieve this? If the files were small, I could easily do it by loading and sorting, but with massive files I can't seem to do it efficiently.
1 comment
dpb
15 Jan 2021
Are the timestamps uniform across files, so that you can compute the number of records needed for each timestep? If so, you could just copy that many records sequentially from each file into the new output file (a rough sketch follows below)...
What is to be done about timestamps in the output file -- each file a column, or just append, or what?
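A minimal sketch of that sequential-copy idea, assuming the timestamps really are uniform across files and the number of records per timestep is known (the file names, the recordsPerStep value, and the output name are placeholders):
% Sketch only: assumes every CSV covers the same, uniform timestamps.
files = ["log1.csv", "log2.csv", "log3.csv"];   % placeholder file names
recordsPerStep = 1000;                          % placeholder: rows per timestep
parts = cell(1, numel(files));
for k = 1:numel(files)
    ds = tabularTextDatastore(files(k));
    ds.ReadSize = recordsPerStep;               % one read call = one timestep
    parts{k} = read(ds);                        % first timestep of this file
end
combined = vertcat(parts{:});                   % records from each file, appended
writetable(combined, "combined.csv");           % placeholder output name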
Accepted Answer
Athul Prakash
22 Jan 2021
Hi there,
The use of a datastore is what comes to mind first: you could call read to obtain a chunk of data at a time and select the required rows efficiently using logical indexing.
% Something like:
r = read(ds);                               % 'ds' is a tabularTextDatastore over the CSV file
r_sel = r(r.Time >= t1 & r.Time <= t2, :);  % keep rows with t1 <= Time <= t2
% Save 'r_sel' into another CSV, or append it to a running variable.
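Since a ~10 GB file won't come back from a single read call, that snippet would normally sit in a loop over the datastore. A minimal sketch of that pattern, assuming the CSV has a Time column that can be compared against t1 and t2 (the file and output names are placeholders):
% Sketch: stream one large CSV in chunks, keeping only rows in [t1, t2].
ds = tabularTextDatastore("big_log.csv");        % placeholder file name
chunks = {};                                     % running collection of matches
while hasdata(ds)
    r = read(ds);                                % next chunk as a table
    r_sel = r(r.Time >= t1 & r.Time <= t2, :);   % rows inside the time window
    if ~isempty(r_sel)
        chunks{end+1} = r_sel;                   %#ok<AGROW> small vs. the raw file
    end
end
extracted = vertcat(chunks{:});                  % combine the selected rows
writetable(extracted, "extracted.csv");          % or: save("extracted.mat", "extracted")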
It's not clear which factor would make the datastore approach too slow in your case.
It may be faster to process one whole CSV file at a time, allocating all of its rows to the different time slots before moving on to the next CSV file; you could consider implementing a bucket sort over the timestamps (roughly as sketched below).
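A rough sketch of that per-file bucketing idea, assuming the time-slot boundaries are known up front and the Time column reads in as a datetime (the slot edges, file name, and output names are placeholders):
% Sketch: assign rows of one CSV to time slots, then move on to the next file.
edges  = datetime(2021,1,15) + hours(0:6);          % placeholder slot boundaries
nSlots = numel(edges) - 1;
pieces = repmat({{}}, 1, nSlots);                   % per-slot lists of row chunks
ds = tabularTextDatastore("big_log.csv");           % placeholder file name
while hasdata(ds)
    r = read(ds);
    slot = discretize(r.Time, edges);               % slot index for each row
    for b = 1:nSlots
        pieces{b}{end+1} = r(slot == b, :);         % stash this chunk's rows for slot b
    end
end
for b = 1:nSlots
    slotData = vertcat(pieces{b}{:});               % all rows that fall in slot b
    writetable(slotData, "slot" + b + ".csv");      % placeholder output names
end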
Finally, you may consider using MapReduce. It's more flexible and powerful than a plain datastore, but it does have a learning curve. You may end up expressing the same approach you've already tried as a MapReduce job and find that it runs faster (a rough sketch follows the documentation link below).
Documentation: Getting started with MapReduce
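A hedged sketch of how that might look with mapreduce, where the mapper keeps only the rows inside the time window and the reducer concatenates them under a single key (the file name, window bounds, key name, and the comparable Time column are all assumptions):
% Sketch: filter rows by time with mapreduce; all selected rows share one key.
ds = tabularTextDatastore("big_log.csv");              % placeholder file name
t1 = datetime(2021,1,15,8,0,0);                        % placeholder window start
t2 = datetime(2021,1,15,9,0,0);                        % placeholder window end
mapper = @(data, info, kv) timeMapper(data, info, kv, t1, t2);
outds  = mapreduce(ds, mapper, @timeReducer);
result = readall(outds);                               % table of Key/Value pairs

function timeMapper(data, ~, intermKVStore, t1, t2)
    sel = data(data.Time >= t1 & data.Time <= t2, :);  % rows inside the window
    if ~isempty(sel)
        add(intermKVStore, 'inWindow', sel);
    end
end

function timeReducer(intermKey, intermValIter, outKVStore)
    parts = {};
    while hasnext(intermValIter)
        parts{end+1} = getnext(intermValIter);          %#ok<AGROW>
    end
    add(outKVStore, intermKey, vertcat(parts{:}));
end
With this layout, result.Value{1} would hold the combined table of selected rows.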
Hope it helps!
0 comments
More Answers (0)