Hi,
I use Matlab mostly to handle and analyze large environmental datasets. Recently I've been trying to reduce matrices containing NaNs to a subset. This would be easy if a given row, matrix(i,:), were all NaNs, but in my case only the element in the third column of that row is NaN. When that happens, I'd like to reject the row and build a second matrix without that/those row(s). I'm trying the following at the moment:
len = length(matrix1(:,1));                          % number of rows
reject = zeros(len, 1);
for j = 1:len
    reject(j,1) = isfinite(matrix1(j,3));            % 1 where column 3 is finite
end
reject3 = logical(horzcat(reject, reject, reject));  % [len x 3] logical mask
matrix2 = matrix1(reject3);
In theory, this should scan the third column of matrix1 (where NaNs are possible) and establish a logical vector from that column, then replicate it into a [len x 3] logical array matching matrix1 in dimensions. matrix2, then, should be the reduced set. But when I do this, matrix2 is returned to me as a single column vector with the three columns strung together.
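To make the intent (and the symptom) concrete, here's a toy example with made-up numbers:

% Toy example, made-up numbers just to illustrate:
matrix1 = [1  2  3;
           4  5  NaN;
           7  8  9];
% What I want matrix2 to be (only rows whose third column is finite):
%    1  2  3
%    7  8  9
% What I actually get from matrix1(reject3) above: the kept elements
% collapsed into one column in column-major order, [1; 7; 2; 8; 3; 9].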
Can you suggest a way to go about this, or a quick fix to my code that might make this possible? Feel free to point out any other problems in the code. I'm a musician-turned-inefficient Matlab user... I have a feeling this could be achieved in four simple lines if I knew what I was doing.
TIA, Jonathan