Radar simulation: How to apply a precise delay?
Hi all,
I am trying to create a simple radar baseband channel model. To do so, I want to generate a delayed version of my transmit baseband signal, i.e. y(t) = x(t − τ) for a target delay τ.
The signals are sampled at a sampling frequency f_s, so applying the delay by shifting the signal by an integer number of samples gives a "delay resolution" of only the sampling period 1/f_s. As the locations of targets may lead to delays that are not multiples of that value, this approach can introduce an error. Are there ways of applying a delay that are not limited by time discretization (apart from upsampling, which I already apply), or is this a fundamental limitation? In particular:
- Is it possible to apply a frequency-dependent phase shift according to the shift theorem of the DFT instead of a time-domain sample shift? And does this give a finer delay resolution?
- I saw that there is a phased.FreeSpace object in the Phased Array System Toolbox that also applies a delay to a signal. Does anyone know how this is implemented?
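Regarding the first question: yes, the DFT shift theorem allows a delay of any real number of samples, because multiplying bin k of an N-point DFT by exp(−j2πkd/N) corresponds to a (circular) shift by d samples, where d need not be an integer. Below is a minimal NumPy sketch of this idea (illustrative only — the thread is MATLAB-based, and the function name `fractional_delay` is my own; the same approach is a few lines of `fft`/`ifft` in MATLAB):

```python
import numpy as np

def fractional_delay(x, d):
    """Delay complex baseband signal x by d samples (d may be non-integer)
    using the DFT shift theorem: X[k] -> X[k] * exp(-j*2*pi*f_k*d),
    where f_k are the normalized DFT bin frequencies in cycles/sample."""
    N = len(x)
    X = np.fft.fft(x)
    f = np.fft.fftfreq(N)               # bin frequencies, cycles per sample
    X *= np.exp(-2j * np.pi * f * d)    # linear phase ramp = time delay
    return np.fft.ifft(X)

# Example: delay a complex tone by 0.3 samples and compare with the
# analytically delayed tone (exact here because the tone fits the DFT grid).
N = 64
n = np.arange(N)
f0 = 5 / N
x = np.exp(2j * np.pi * f0 * n)
y = fractional_delay(x, 0.3)
expected = np.exp(2j * np.pi * f0 * (n - 0.3))
print(np.allclose(y, expected))
```

Two caveats: the shift is circular, so zero-pad the signal so that no energy wraps around the end of the block, and the result is only exact for signals that are well band-limited relative to f_s (for broadband or real-valued signals the Nyquist bin needs extra care). Within those constraints the delay resolution is limited by numerical precision, not by the sample grid.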