When is frequency offset correction needed in lteDLPerfectChannelEstimate?
Hey, I have a few questions regarding the use of time and frequency offset in the function lteDLPerfectChannelEstimate.
My general settings are as follows:
channel.DelayProfile = 'EPA';
channel.DopplerFreq = 0;
channel.InitTime = 0;
[rxwave, channel_info] = lteFadingChannel(channel,[txwave;zeros(25,1)]);
TimeOffset = lteDLFrameOffset(enb,rxwave);
FreqOffset = lteFrequencyOffset(enb,rxwave);
1) The first thing I don't understand is why, with zero Doppler and zero delay, the estimated frequency offset "FreqOffset = lteFrequencyOffset(enb,rxwave)" and timing offset "TimeOffset = lteDLFrameOffset(enb,rxwave)" are still not 0.
Isn't a frequency offset caused only by Doppler effects? Does lteFadingChannel add additional delay and frequency offsets?
2) My main goal is to obtain the most precise (true) channel response, and I don't understand when I should incorporate the timing offset and the frequency offset. My understanding is that if the Doppler frequency is set to 0, then the channel impulse response should stay constant. But when I use the command:
[hestIdeal] = lteDLPerfectChannelEstimate(enb,channel,[TimeOffset ,FreqOffset]);
I'm getting that the impulse response changes over the time grid.
However, when I ignore the frequency offset and use:
[hestIdeal2] = lteDLPerfectChannelEstimate(enb,channel,[TimeOffset,0]);
I'm getting that the channel response is indeed constant over time.
What am I missing? Should I use the frequency offset at all, and why is it not 0 in the first place? Can anyone please clarify this for me? Thanks.
Iain Stirling on 16 Aug 2018
The purpose of the frequency offset option on lteDLPerfectChannelEstimate is to match any frequency offset correction applied to the received waveform, so that the perfect channel estimate remains well matched with the received waveform. It plays the same role as the timing offset parameter. So if you have no frequency correction on the received waveform, you do not need to set the frequency offset option on lteDLPerfectChannelEstimate.
Note that even when no frequency offset is present, lteFrequencyOffset will return a small non-zero estimate, because noise limits the estimator's accuracy. If you then pass that small, erroneous estimate to lteDLPerfectChannelEstimate, it applies a frequency offset that is not actually present in the channel, hence the variation over time.
In summary, if you are not correcting for a frequency offset on your received waveform, you do not need to set the frequency offset option in lteDLPerfectChannelEstimate.
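To make the pairing concrete, here is a short sketch of the two consistent workflows, using the variables from the question. This assumes enb, channel, txwave, and rxwave are configured as above; lteFrequencyCorrect is the toolbox function for applying a frequency offset correction to a waveform.

```matlab
% Estimate timing and frequency offsets from the received waveform.
TimeOffset = lteDLFrameOffset(enb, rxwave);
FreqOffset = lteFrequencyOffset(enb, rxwave);

% Workflow A: no frequency correction is applied to rxwave, so do not
% pass a frequency offset to the perfect channel estimator either.
hestIdealA = lteDLPerfectChannelEstimate(enb, channel, [TimeOffset 0]);

% Workflow B: correct the estimated offset on the received waveform,
% and pass the same value to the perfect estimator so that the
% estimate stays matched to the corrected waveform.
rxcorr     = lteFrequencyCorrect(enb, rxwave, FreqOffset);
hestIdealB = lteDLPerfectChannelEstimate(enb, channel, [TimeOffset FreqOffset]);
```

The key point is that the offsets given to lteDLPerfectChannelEstimate should mirror whatever corrections you actually apply to the waveform you equalize, not the raw estimator outputs on their own.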