VHTMIMOPacketErrorRateExample - Why is LDPC on IEEE 802.11ac worse than BCC?

Jérôme Härri on 6 Sep 2017
Hi All, I ran the WLAN System Toolbox example VHTMIMOPacketErrorRateExample, which basically provides a PER simulation of IEEE 802.11ac on an 8x8 MIMO configuration with BCC as the channel coding.
Now, I tried to use LDPC instead of BCC, and I was expecting it to perform a bit better (it does on IEEE 802.11n). But the output is much worse...
Would anybody know where this poor performance could come from? I used the default TGacChannel and did not change anything in the example besides the channel coding.
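In case it helps, the switch amounts to changing the ChannelCoding property of the example's wlanVHTConfig object. A minimal sketch of that configuration is below; the LDPC line is the only change, and the other property values are illustrative, not copied from the example:
cfgVHT = wlanVHTConfig;              % VHT (802.11ac) format configuration
cfgVHT.ChannelBandwidth    = 'CBW80';
cfgVHT.NumTransmitAntennas = 8;      % 8x8 MIMO, as in the example
cfgVHT.NumSpaceTimeStreams = 8;
cfgVHT.APEPLength          = 3000;   % illustrative payload length (bytes)
cfgVHT.MCS                 = 4;      % illustrative MCS
cfgVHT.ChannelCoding       = 'LDPC'; % the only change: 'LDPC' instead of the default 'BCC'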
Thanks in advance,
BR,
Jérôme

Answers (2)

BABA CHOUDHURY on 2 Jan 2019
Hi Jerome, I know it's very late now to respond to your query.
Still, I was also running similar simulations and found LDPC to outperform BCC in every scenario. Maybe some other parameter is affecting the calculations.

Darcy Poulin on 12 Aug 2020
I had exactly the same issue, and spoke with MathWorks support.
It turns out that you should configure the LDPC decoding method to use 'norm-min-sum' rather than the default 'bp' algorithm. When I made this change, I saw the expected improvement in link performance.
For 11ac, you configure it like this:
rxPSDU = wlanVHTDataRecover(vhtdata, chanEst, nVarVHT, cfgVHT, 'LDPCDecodingMethod', 'norm-min-sum');
The same thing occurs in 11ax. Here you configure it like this:
rxPSDU = wlanHEDataBitRecover(eqDataSym,nVarEst,csi,cfgHE,'LDPCDecodingMethod','norm-min-sum');
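For completeness, here is a minimal sketch of where that name-value pair slots into the 11ac receiver processing. It assumes the field indices (ind), received waveform (rx), noise-variance estimate (noiseVarEst) and cfgVHT from the shipped example are already available, so the variable names are only illustrative:
% Extract and demodulate the VHT-LTF to estimate the MIMO channel
rxVHTLTF = rx(ind.VHTLTF(1):ind.VHTLTF(2), :);
demodVHTLTF = wlanVHTLTFDemodulate(rxVHTLTF, cfgVHT);
chanEst = wlanVHTLTFChannelEstimate(demodVHTLTF, cfgVHT);
% Recover the PSDU from the VHT-Data field, forcing the normalized
% min-sum LDPC decoder instead of the default 'bp' (belief propagation)
rxVHTData = rx(ind.VHTData(1):ind.VHTData(2), :);
rxPSDU = wlanVHTDataRecover(rxVHTData, chanEst, noiseVarEst, cfgVHT, ...
    'LDPCDecodingMethod', 'norm-min-sum');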
