The conjugate gradient method solves a system of linear equations, Ax = b, where A is symmetric positive definite, without computing the inverse of A. It requires very little memory and is therefore particularly suitable for large-scale systems.
It is faster than other approaches, such as Gaussian elimination, if A is well-conditioned. For example,
A = U*diag(s+max(s))*U'; % make A symmetric and well-conditioned
Conjugate gradient is about two to three times faster than A\b, which uses Gaussian elimination.
Yi Cao (2020). Conjugate Gradient Method (https://www.mathworks.com/matlabcentral/fileexchange/22494-conjugate-gradient-method), MATLAB Central File Exchange. Retrieved .
Dear Yi Cao,
Thanks very much for sharing, but based on my test the code is not entirely correct, as it sometimes produces a wrong solution.
Please check my test:
x = rand(1000,5);
y = rand(1000,1);
sigma = 1; c = 10;           % example values; these were undefined in the original post
len = length(y);
k = getkernel(x,c,sigma);    % arguments must be in the order of the definition below
A = [0,ones(1,len);ones(len,1),k];
b = [0;y];                   % right-hand side; "B" was undefined in the original post
tic; params = conjgrad(A,b); toc;
% where getkernel is defined as:
function [ matrix ] = getkernel(x,c,sigma)
n = size(x,1);               % "n" was undefined in the original
matrix = exp((-bsxfun(@plus,sum(x.^2,2),sum(x.^2,2)'-2*(x*x')))/(2*sigma)) + eye(n)/c;
end
The results are totally wrong.
I guess the matrix A in this test is not strictly positive definite. If this issue could be fixed, the code would be much more useful, because this formulation appears often when training machine learning models.
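The commenter's guess can be checked directly. The sketch below (in Python, with made-up sizes and parameter values) rebuilds the same kind of bordered matrix A = [0, 1'; 1, K] with an RBF kernel plus ridge term, and inspects its spectrum; the zero diagonal block forces at least one negative eigenvalue, so plain CG is not guaranteed to converge on it:

```python
import numpy as np

# Hypothetical Python reconstruction of the test: RBF kernel + ridge term,
# bordered by a row/column of ones (the LS-SVM style linear system).
rng = np.random.default_rng(1)
x = rng.random((50, 5))
sigma, c = 1.0, 10.0                     # example values, as in the test
sq = np.sum(x**2, axis=1)
K = np.exp(-(sq[:, None] + sq[None, :] - 2 * x @ x.T) / (2 * sigma)) \
    + np.eye(50) / c                     # K itself is symmetric positive definite
A = np.block([[np.zeros((1, 1)), np.ones((1, 50))],
              [np.ones((50, 1)), K]])
eigs = np.linalg.eigvalsh(A)
print(eigs.min() < 0)  # True: the bordered matrix is indefinite
```

Since K is positive definite, the Schur complement of K in A is 0 - 1'K⁻¹1 < 0, so A always has a negative eigenvalue. Solvers designed for symmetric indefinite systems (e.g. MINRES) are the usual remedy.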
Could anybody tell me how to evaluate the time required by the SVD? Please note that the svd called directly from the workspace is a built-in function, which is much faster than the svd.m source code available on the MATLAB website. So which one is more suitable for evaluating the required time?
Actually, I am trying to compare a new algorithm with the SVD in computational cost or time. I suppose the built-in SVD function might be faster than the source-code SVD function. Could anybody help me? Thanks a lot for your kind help in advance! My email address is firstname.lastname@example.org.
A\b uses Cholesky factorization if A is symmetric. It switches to Gaussian elimination when A is not symmetric, or when the factorization fails because A is not positive definite (a negative value appears under the square root).
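That try-Cholesky-then-fall-back dispatch can be sketched as follows (a Python illustration of the idea, not MATLAB's actual mldivide internals; for simplicity it uses a general solve for the triangular systems rather than dedicated forward/back substitution):

```python
import numpy as np

def solve_sym(A, b):
    """Try Cholesky on a symmetric matrix; fall back to a general
    LU (Gaussian elimination) solve if A is not positive definite."""
    try:
        L = np.linalg.cholesky(A)        # succeeds only if A is SPD
        y = np.linalg.solve(L, b)        # solve L y = b
        return np.linalg.solve(L.T, y)   # solve L' x = y
    except np.linalg.LinAlgError:
        return np.linalg.solve(A, b)     # general LU with pivoting

# SPD case takes the Cholesky branch; an indefinite matrix falls back to LU.
A_spd = np.array([[4.0, 1.0], [1.0, 3.0]])
A_indef = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([1.0, 2.0])
print(np.allclose(A_spd @ solve_sym(A_spd, b), b))      # True
print(np.allclose(A_indef @ solve_sym(A_indef, b), b))  # True
```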
I don't know if it is present in all versions, but MATLAB already has a built-in conjugate gradient solver (pcg).
Anyway, thanks to Cao; submissions that work are always useful, and in the worst case you can study the algorithm.
To consider two trivial cases: changing the initial value to x = b is slightly faster.
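Whether x = b is a better starting point than x = 0 depends on how close b is to the true solution, so it is worth measuring. A sketch using SciPy's built-in CG (counting iterations via the callback; the matrix construction mirrors the well-conditioned example from the description):

```python
import numpy as np
from scipy.sparse.linalg import cg

# Compare CG iteration counts for initial guesses x0 = 0 and x0 = b.
rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((100, 100)))
s = rng.random(100)
A = U @ np.diag(s + s.max()) @ U.T        # symmetric, well-conditioned
b = rng.standard_normal(100)

def iterations(x0):
    count = []                             # one entry per CG iteration
    x, info = cg(A, b, x0=x0, callback=lambda xk: count.append(1))
    assert info == 0                       # 0 means converged
    return len(count), x

n_zero, x1 = iterations(np.zeros(100))
n_b, x2 = iterations(b.copy())
print(n_zero, n_b)                         # compare the two iteration counts
```

For this matrix the eigenvalues cluster near a single value, so b is already roughly a scaled solution and the warm start tends to help; for a general A the gain can vanish.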