
The coefficients of the 3rd-order predictor

ahmed allaheani on 8 Mar 2018
Hello everyone,
I am working on an FIR predictor. I have the following sequence x(n) = [-1 -3 5 2], and this is the code to get the autocorrelation:
x = [-1 -3 5 2];
P = length(x)-1;
rxx = xcorr(x)
N = -P:P;
stem(N,rxx);
xlabel('Index');
ylabel('rxx');
title('The autocorrelation of x');
##############################################
and the resulting sequence is rxx = [-2.0000 -11.0000 -2.0000 39.0000 -2.0000 -11.0000 -2.0000]
Q: How can I find the coefficients of a third-order predictor?
Thanks for the help.

Accepted Answer

Abraham Boayue on 9 Mar 2018
Hey Ahmed,
You might find this code useful if you are thoroughly familiar with the literature you are studying; otherwise it may be difficult to grasp, as it contains some serious mathematical formulations. Be advised that this is not a solution to your problem, but a guide to walk you through it. The code solves a linear prediction problem with just two coefficients (the coefficient vector of x is [-0.95 0.90]), but it is valid for any size of vector x. You can also use the MATLAB built-in function lpc (read about it) to solve your problem. Ask me if you get stuck.
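For the specific sequence in the question, here is a minimal sketch of the standard route (it assumes the Signal Processing Toolbox for xcorr and lpc): build the 3x3 Toeplitz autocorrelation matrix from rxx and solve the normal equations for the predictor coefficients. lpc solves the same problem and should agree up to its sign convention, since it returns the prediction-error filter [1, -a'].

x = [-1 -3 5 2];
rxx = xcorr(x);            % lags -3..3; rxx(4) is lag 0
r = rxx(4:7); r = r(:);    % r(0) r(1) r(2) r(3) = 39 -2 -11 -2, as a column
R3 = toeplitz(r(1:3));     % 3x3 autocorrelation matrix
a = R3 \ r(2:4);           % third-order predictor coefficients
A = lpc(x,3);              % prediction-error filter; -A(2:4).' should match a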

More Answers (1)

Abraham Boayue on 9 Mar 2018
function [e,w,y,W] = PreDLMS(x,mu,M)
% [e,w,y,W] = PreDLMS(x,mu,M): Implements LMS for linear prediction.
% Inputs:  x: input signal, mu: step size, M: filter order
% Outputs: e: error signal, w: estimated coefficient vector, y: output of
%          the filter, and W: a matrix containing the trajectories of the
%          estimated coefficients.
N = length(x);
w = zeros(M,1);
y = zeros(1,N);
e = zeros(1,N);
W = zeros(M,N-1);
u = zeros(1,M);
for k = M:N-1
    u = [x(k),u(1:M-1)];   % shift the newest sample into the regressor
    y(k) = u*w;            % one-step prediction of x(k+1)
    e(k) = x(k+1)-y(k);    % prediction error
    w = w + mu*u'*e(k);    % LMS weight update
    W(:,k) = w;            % store the weights after every iteration
end
end
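As a quick smoke test of PreDLMS (a minimal sketch; the AR(1) test signal and step size below are illustrative choices, not from the original post):

v = randn(2000,1);
x = filter(1,[1 -0.8],v);        % AR(1): x(n) = 0.8*x(n-1) + v(n)
[e,w,y,W] = PreDLMS(x,0.01,1);   % first-order LMS predictor
w                                % should settle near 0.8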
clc; close all
clear
tic
Lx = [-1,3]; % x-axis limits for the contour plots
Ly = [-3,1]; % y-axis limits for the contour plots
N = 501; % Signal size (also number of iterations)
Niter = 1000; % Ensemble size
mua = 0.04; % Large Step-size
mum = 0.01; % Small Step-size
% Generate AR(2) process with coefficients a1 = -0.95, a2 = 0.9
a1 = -0.950;
a2 = 0.9;
varx = 1;
r0 = varx;
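% varw is the driving-noise variance that makes the AR(2) output variance
% equal to varx; r1 and r2 below follow from the Yule-Walker equations.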
varw = ((1-a2)*((1+a2)^2-a1^2)/(1+a2))*varx;
r1 = -a1/(1+a2)*r0;
r2 = (-a2+a1^2/(1+a2))*r0;
lam1 = (1-a1/(1+a2))*varx;
lam2 = (1+a1/(1+a2))*varx;
XR = lam1/lam2;
d = [r1;r2];
R = toeplitz([r0,r1,r2]); % Autocorrelation matrix
Ru = toeplitz([r0 r1]);
Rdu = d;
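% Sanity check: the Wiener solution of the second-order normal equations
% Ru*w = Rdu recovers the negated AR coefficients, [-a1; -a2] = [0.95; -0.9].
w_opt = Ru\Rdu;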
% Ensemble averaging for large mu = 0.04
w1 = zeros(1,N);
w2 = zeros(1,N);
w_avg1 = zeros(1,N-1);
w_avg2 = w_avg1;
J_avg = zeros(1,N);
% Ensemble averaging for small mu = 0.01
wm1 = zeros(1,N);
wm2 = zeros(1,N);
wm_avg1 = zeros(1,N-1);
wm_avg2 = wm_avg1;
Jm_avg = zeros(1,N);
M = 2;
mu = [mua mum];
q = length(mu);
c = [1 a1 a2];
for i=1:Niter
v = sqrt(varw)*randn(N,1); % driving noise (named v to avoid clashing with the weights w)
x = filter(1,c,v);         % synthesize one AR(2) realization
[e,w,y,W] = PreDLMS(x,mua,M);
w1 = W(1,:);
w2 = W(2,:);
w_avg1 = w_avg1+w1; % mu = 0.04
w_avg2 = w_avg2+w2;
J_avg = J_avg+abs(e.^2);
[em,wm,ym,Wm] = PreDLMS(x,mum,M);
wm1 = Wm(1,:);
wm2 = Wm(2,:);
wm_avg1 = wm_avg1+wm1; % mu = 0.01
wm_avg2 = wm_avg2+wm2;
Jm_avg = Jm_avg+abs(em.^2);
end
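% Normalize the accumulated trajectories and squared errors by the ensemble size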
w_avg1 = w_avg1/Niter; % mu = 0.04
w_avg2 = w_avg2/Niter;
J_avg = J_avg/Niter;
wm_avg1 = wm_avg1/Niter; % mu = 0.01
wm_avg2 = wm_avg2/Niter;
Jm_avg = Jm_avg/Niter;
ind = 0:N-1;
indw = 1:N-1;
% Generate contour curves using the STD error function
L1 = 400;
Lw1 = Lx(1);
Lw2 = Lx(2);
delta = Lw2-Lw1;
step = delta/(L1-1);
alpha = Lw1:step:Lw2;
beta = -Lw2:step:-Lw1;
[alpha ,beta] = meshgrid(alpha,beta);
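% The grid evaluates the quadratic MSE surface J(w) = r0 - 2*d'*w + w'*Ru*w
% with w = [alpha; beta], expanded termwise below.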
J = r0 - 2*(Ru(1,2)*alpha + Rdu(2)*beta) + 2*Ru(1,2)*alpha.*beta +...
Ru(1,1)*(alpha.^2 +beta.^2);
% Plotting some calculated results
figure()
subplot(221);
min_J = min(min(J));
max_J = max(max(J));
spacing = 0.05; % finer contour spacing for the smaller values of J
topspace = 2;   % coarser spacing toward the upper limit
levels = [min_J:spacing:1, 1:topspace:max_J-18];
contour(alpha,beta,J,levels); hold on;
plot(w1,w2,'k','linewidth',1.5);
plot(w_avg1,w_avg2,'dr','linewidth',6)
plot(wm_avg1,wm_avg2,'+b','linewidth',1.5);
title('(a) Averaged Trajectory','fontsize',10);
xlabel('w_{1}','fontsize',14);
ylabel('w_{2}','fontsize',14);
colormap(cool)
colorbar;
axis([-1 3 -3 1])
legend('J(w1,w2)','(w1,w2), \mu = 0.04',...
'Ensemble (w1,w2), \mu = 0.04','Ensemble (w1,w2), \mu = 0.01');
grid;
%##### Learning curves J for mua = 0.04 and mum = 0.01 #####
subplot(222);
plot(ind,J_avg,'linewidth',2); hold on
plot(ind,Jm_avg,'linewidth',2);
plot([0,N-1],[varw,varw],'k--');
grid
a = title('Ensemble-average of |e(i)|^2');
set(a,'fontsize',14);
a = xlabel('NO. OF ITERATIONS');
set(a,'fontsize',14);
a = ylabel('MSE'); % plotted on a linear scale, not in dB
set(a,'fontsize',14);
set(gca,'xtick',[0,N-1],'ytick',[0,varw,0.5,1],'fontsize',14);
a = legend('\mu=0.04','\mu=0.01');
set(a,'fontsize',14)
text(30,0.16,'\mu=0.04','fontsize',12);
text(50,0.7,'\mu=0.01','fontsize',12);
%##### w1 and w2 at mu = 0.04 #####
subplot(223);
plot(indw,w_avg1,'linewidth',2,'color','k'); hold on
plot(indw,w_avg2,'linewidth',2,'color','k');
plot(indw,w1,'linewidth',2,'color','m');
plot(indw,w2,'linewidth',2,'color','m');
plot([0,N-1],[-a2,-a2],'k--',[0,N-1],[-a1,-a1],'k--');
set(gca,'xtick',[0,250,N-1],'ytick',[-a2,0,-a1],'fontsize',14);
a = title('Estimated coefficients');
set(a,'fontsize',14);
a = xlabel('NO. OF ITERATIONS');
set(a,'fontsize',14);
a = ylabel('coef. trajectories');
set(a,'fontsize',14);
text(55, 0.4,'a_1(n)','fontsize',10);
text(55,-0.35,'a_2(n)','fontsize',10);
text(70,-0.15,'Note how the solution converges to 0.95 and -0.9, as it should.');
grid
%##### J_avg and a single-realization error e #####
subplot(224);
plot(ind,J_avg,'linewidth',2,'color','r'); hold on
plot(ind,e,'linewidth',2,'color','b');
plot([0,N-1],[varw,varw],'k--');
set(gca,'xtick',[0,N-1],'ytick',[0,varw,0.5,1],'fontsize',10);
grid
a = legend('Ensemble-average |e(i)|^2','Single realization: e(i)',...
'\sigma_v^2 = 0.1425');
set(a,'fontsize',14)
a = title('MSE-learning curves');
set(a,'fontsize',14);
a = xlabel('NO. OF ITERATIONS');
set(a,'fontsize',14);
a = ylabel('|e(w)|^2, e(w)');
axis([-1,N,0,1.1]);
set(a,'fontsize',14);
toc
