Simulink Transport Delay's influence on Variance Percentage?

Yufeng on 18 May 2012
Dear all,
I have a signal that I put through a transport delay of 0.2 seconds in Simulink. Let's call the time-delayed signal u. Noise is added to the resulting time-delayed signal u to get signal u_noise.
I want to keep the noise in the signal at a variance percentage of 10%, so I calculate
var(u-u_noise)/var(u)*100
When I do not use the Transport Delay, I can easily tweak the noise level with a gain to reach 10%. However, when I use the Transport Delay, the variance percentage drops, and when I increase the gain that tweaks the noise level, there is suddenly a maximum to the variance percentage, which is lower than the 10% I need. I plotted the variance percentage against the gain: at first there is a steep rise up to the maximum, after which it stabilizes. But, as said, that maximum (about 7%) is lower than I need. Does anybody know why the Transport Delay causes this? In theory it should only delay the signal, so I expected to get the same variance percentage.
Any help is very much appreciated.
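As a sanity check on the metric itself, the variance-percentage calculation can be reproduced offline. The sketch below uses NumPy with a sine wave and white noise as stand-ins for the Simulink signals (the signal shapes, sample time, and seed are assumptions, not taken from the original model); it solves for the gain `g` that yields exactly a 10% variance percentage:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the Simulink signals: a sine wave "u"
# (representing the already time-delayed signal) and unit-variance
# white noise "n".
t = np.arange(0.0, 10.0, 0.01)
u = np.sin(2 * np.pi * t)
n = rng.standard_normal(t.size)

# Choose the gain so the variance percentage is 10%:
# var(u - u_noise) = var(g*n) = g^2 * var(n), so solve
# g^2 * var(n) / var(u) = 0.10 for g.
g = np.sqrt(0.10 * np.var(u) / np.var(n))
u_noise = u + g * n

pct = np.var(u - u_noise) / np.var(u) * 100
print(round(pct, 1))  # close to 10.0
```

If the percentage measured in Simulink deviates from this closed-form relationship, the discrepancy may come from how the two signals are aligned or buffered in the model (e.g. comparing the noisy delayed signal against an undelayed copy, or the delay block interpolating between stored samples) rather than from the formula itself.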
  1 comment
Kaustubha Govind on 18 May 2012
I don't know enough about the Transport Delay block to comment on this, but I wonder if you have experimented with increasing the buffer size parameter on the dialog to see if that improves things?


Answers (0)
