Slow tdmsread Performance with Large Dataset - Seeking Solutions to Reduce Sampling Rate

14 views (last 30 days)
Kunpeng
Kunpeng on 16 Jun 2025
Commented: Kunpeng on 18 Jun 2025
Hi everyone,
I'm encountering significant performance issues when using tdmsread to process a large TDMS file (1.2 GB) with the following specifications:
  • NumSamples: 80,000
  • ChannelName count: 7,000
Currently, reading all channels takes an unacceptably long time. Since I need to analyze all 7,000 channels, I'm exploring ways to reduce the sampling rate (e.g., from 80,000 to 10,000 samples) to speed up the process.
My Questions:
  1. Is there a built-in way to specify a lower sampling rate when using tdmsread or tdmsDatastore? For example, can I directly read every 8th sample to achieve an 8x reduction?
  2. If not, what's the most efficient approach to downsample the data during reading, rather than after loading the entire dataset into memory (see the sketch after this list)?
  3. Are there alternative functions or workflows (e.g., using datastore properties) that can handle this scenario more efficiently?
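For reference, here is a minimal sketch of the post-load downsampling I mention in question 2 and would like to avoid, since it loads the full 1.2 GB file before discarding any samples (the file name and group index are placeholders, and I'm assuming tdmsread returns one table per channel group):
data = tdmsread("yourfile.tdms");   % loads the entire file into memory first
groupData = data{1};                % one table per channel group
reduced = groupData(1:8:end, :);    % keep every 8th sample only afterwards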

Answers (1)

Shishir Reddy
Shishir Reddy on 18 Jun 2025
Reading a large TDMS file with thousands of channels and high sample counts can lead to significant performance bottlenecks.
Currently, 'tdmsread' and 'tdmsDatastore' do not support direct downsampling (such as reading every Nth sample) during file access. They are designed to load the raw data directly from the file.
To avoid loading the entire dataset into memory, consider combining 'tdmsDatastore' with a custom read loop that reads the data in chunks and performs downsampling on the fly, as demonstrated in the following code snippet.
ds = tdmsDatastore("yourfile.tdms");
ds.ReadSize = 1000;                     % samples per read; tune based on available memory
while hasdata(ds)
    dataChunk = read(ds);
    dataChunk = dataChunk(1:8:end, :);  % keep every 8th sample of the chunk
    % Further processing or saving
end
There is no out-of-the-box parameter to downsample directly in 'tdmsread', so using 'tdmsDatastore' with on-the-fly processing is the best option for memory-efficient downsampling.
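If the downsampled data needs to be kept for later analysis, one possible extension of the loop above is to collect each reduced chunk and concatenate them at the end. This is a minimal sketch that assumes 'read' returns a single samples-by-channels table; for files with multiple channel groups it may instead return a cell array of tables, in which case you would index into the group of interest.
ds = tdmsDatastore("yourfile.tdms");
ds.ReadSize = 1000;                 % a multiple of the stride keeps chunks aligned
stride = 8;                         % keep every 8th sample (80,000 -> 10,000)
chunks = {};
while hasdata(ds)
    dataChunk = read(ds);
    chunks{end+1} = dataChunk(1:stride:end, :); %#ok<AGROW>
end
downsampled = vertcat(chunks{:});   % one table with the reduced sample count
Because 1000 is a multiple of 8, the every-8th-sample pattern stays aligned across chunk boundaries.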
For more information, please refer to the documentation for the 'tdmsread' and 'tdmsDatastore' functions.
I hope this helps.
  1 comment
Kunpeng
Kunpeng on 18 Jun 2025

Dear Shishir,

Thank you for your reply and assistance – I truly appreciate it. Unfortunately, the reading speed is still too slow for my workflow; the time it takes to read these files exceeds what I can accept. I'm now exploring other software to achieve fast reading of large files.

Thanks again for your help!

Best regards, Kunpeng


Version

R2024b
