limits in array size for filtfilt?

5 views (last 30 days)
actinium on 5 Mar 2019
Edited: actinium on 6 Mar 2019
Just wondering: is there a hard size limit for the filtfilt function? I was looking at an ECG time series collected at 500 Hz over several hours, an array of ~14e7 entries in double. In the process of extracting the QRS complex I first run it through a Butterworth bandpass filter, using N = 3, Wn = [5 15]*2/fs in the butter(N,Wn) function. It runs fine up to 6810691 entries; anything over that gives an all-NaN output. Have I missed something obvious? I am running on a machine with 16 GB RAM and an i7-4xxx, and during the process it hasn't hit the memory/CPU limits. Thank you!
Regards,
Alan
Edit: I have included the data file (~55 MB).
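
For reference, a minimal sketch of the setup described in the question, with a synthetic stand-in for the recorded signal (the variable names here are assumptions, not the original code):

% Sketch of the filtering described in the question; ecg is a synthetic stand-in.
fs  = 500;                          % sampling rate, Hz
ecg = randn(1e6, 1);                % replace with the ~14e7-sample recording
[b, a] = butter(3, [5 15]*2/fs);    % Butterworth bandpass, 5-15 Hz passband
ecg_filt = filtfilt(b, a, ecg);     % zero-phase filtering
fprintf('NaN samples in filtered output: %d\n', nnz(isnan(ecg_filt)))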

Accepted Answer

Star Strider on 5 Mar 2019
Use the isnan function to check to be certain that none of the data are NaN:
NaNsum = nnz(isnan(your_signal))
The first NaN will propagate throughout the rest of the filtered signal. If this result is 0, I have no other suggestions or explanation.
If there are any, use:
NaNidx = find(isnan(your_signal));
to discover their locations.
As a side note, the normal EKG (without arrhythmias) has a spectrum of 0 to 50 Hz, so the passband you’re using will eliminate a significant amount of detail.
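
A small, hedged demonstration of the propagation point above: a single NaN in the input contaminates essentially the whole filtfilt output. The signal here is synthetic; the filter coefficients match the ones given in the question.

% Demonstration: one NaN in the input makes the entire filtfilt output NaN.
fs = 500;
[b, a] = butter(3, [5 15]*2/fs);   % Butterworth bandpass, 5-15 Hz, as in the question
x = randn(1e5, 1);                 % synthetic stand-in signal
x(5e4) = NaN;                      % a single bad sample in the middle
y = filtfilt(b, a, x);
fprintf('NaN samples in output: %d of %d\n', nnz(isnan(y)), numel(y))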
4 Comments
actinium on 6 Mar 2019
When I split the data into smaller arrays, it works fine. That's why I'm wondering if there is a hard size limit.
Star Strider on 6 Mar 2019
I have never encountered the problem you're reporting. The documentation for filtfilt (link) doesn't mention anything about support for Tall Arrays, assuming your data meets those criteria, and filtfilt is not listed amongst the Functions That Support Tall Arrays (A - Z) (link) either. The documentation also doesn't mention any sort of length restriction.
If filtfilt works with shorter vectors, consider concatenating the segments you successfully filter, provided your code does not generate significant transients at either end of any segment. There may also be other options, such as designing your own overlap-and-add approach.
I have no idea what the problem could be.
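
A sketch of the segment-and-concatenate idea from the comment above, done overlap-and-discard style rather than true overlap-add: each chunk is filtered with extra samples on both sides, and only the interior is kept so that edge transients are thrown away. The segment and padding lengths are arbitrary assumptions, not recommendations.

% Sketch: filtfilt a long vector in overlapping segments, keeping only the
% interior of each segment so that edge transients are discarded.
fs = 500;
[b, a] = butter(3, [5 15]*2/fs);
x = randn(5e6, 1);                            % stand-in for the long ECG vector
segLen = 1e6;                                 % samples kept per segment (assumption)
pad    = 5e3;                                 % extra samples on each side (assumption)
y = zeros(size(x));
for k = 1:segLen:numel(x)
    i1 = max(1, k - pad);                     % padded segment start
    i2 = min(numel(x), k + segLen - 1 + pad); % padded segment end
    seg = filtfilt(b, a, x(i1:i2));
    j1 = k;                                   % range actually kept
    j2 = min(numel(x), k + segLen - 1);
    y(j1:j2) = seg(j1 - i1 + 1 : j2 - i1 + 1);
end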


More Answers (1)

Walter Roberson on 6 Mar 2019
Yes, there is a hard size limit in filtfilt: data with fewer than 10000 points is put through a different routine that handles the endpoints differently. I suspect the routine for larger arrays exists to be more space-efficient, but I am not positive.
Other than that: NO, there is no built-in limit other than what your memory can hold.
Note that several temporary arrays are needed during filtering, so do not expect to be able to process anything more than roughly 1/5th of your memory limit.
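
As a rough worked example of that rule of thumb, using the ~14e7-sample double vector from the question (the factor of five is the estimate from the answer above, not a documented figure):

% Rough working-memory estimate for filtfilt on the vector from the question.
n          = 14e7;                  % samples, double precision
inputGiB   = n * 8 / 2^30;          % ~1.04 GiB for the raw vector
workingGiB = 5 * inputGiB;          % ~5.2 GiB including temporary copies (assumed factor)
fprintf('Input: %.2f GiB, estimated peak working set: %.2f GiB\n', inputGiB, workingGiB)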

Version: R2018b
