Implementing a basic low-pass filter
I have a basic question. I want to implement a simple first-order low-pass filter with a cutoff frequency of 100 Hz and then apply it to a 1 Hz time-domain signal. I expect the output to be essentially the same as the input. However, when I convolve the impulse response of the filter with this input signal, I get an unwanted gain. My code is attached below. Can you help me with this?
FS = 1000;                      % sampling frequency (Hz)
T = 1/FS;                       % sampling period (s)
fc = 100;                       % cutoff frequency (Hz)
r = 10 * 1000;                  % R = 10 kOhm
c = 1 / (2 * pi * r * fc);      % C chosen so that 1/(R*C) = 2*pi*fc
num = 1/(r*c);                  % H(s) = (1/RC) / (s + 1/RC), unity DC gain
den = [1 1/(r*c)];
sys_tf_model = tf(num, den);
time = 0:T:10;                  % 10 s time vector
signal = sin(2 * pi * 1 * time);                 % 1 Hz test signal
impulse_response = impulse(sys_tf_model, time);  % sampled impulse response
output1 = T * conv(signal, impulse_response);    % discrete approximation of the convolution integral
output2 = lsim(sys_tf_model, signal, time);      % reference: simulate the continuous-time system
subplot(3,1,1); plot(time, output1(1:length(time))); title('Convolution');
subplot(3,1,2); plot(time, output2); title('Lsim output');
subplot(3,1,3); plot(time, signal); title('Input');
1 Comment
Sudarshan Kolar
on 3 Mar 2017
You are approximating a continuous-time convolution integral with a discrete sum. At FS = 1000 Hz the sampling step (1 ms) is coarse compared with the filter's time constant RC = 1/(2*pi*fc) ≈ 1.6 ms, so the impulse response changes significantly between samples and the sum misestimates the integral, which shows up as a gain error.
Replace the sampling period T by T/10 and you will see that the gap is reduced (see the sketch below).
I would not recommend conv for your application.
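For reference, here is a minimal sketch of the refinement suggested above (same filter, sampled ten times finer; a shorter 2 s window is used so the conv call stays fast). It assumes the Control System Toolbox functions tf, impulse, and lsim from the original code:

FS = 10000;                          % 10 kHz instead of 1 kHz (T -> T/10)
T  = 1/FS;
fc = 100;
sys = tf(2*pi*fc, [1 2*pi*fc]);      % H(s) = wc/(s + wc), unity DC gain
t = 0:T:2;                           % shorter window to keep conv inexpensive
x = sin(2*pi*1*t);                   % 1 Hz test signal
h = impulse(sys, t);                 % impulse response sampled on the fine grid
y_conv = T * conv(x, h);             % Riemann-sum approximation of the convolution integral
y_conv = y_conv(1:numel(t));         % keep only the first length(t) samples
y_lsim = lsim(sys, x, t);            % reference simulation of the continuous system
fprintf('max |conv - lsim| = %g\n', max(abs(y_conv(:) - y_lsim(:))));

With the finer grid the Riemann sum should track the lsim reference much more closely, which suggests the extra "gain" is a discretization artifact of T*conv rather than a property of the filter itself.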
Answers (0)