Substituting Matrices in symbolic equations
Dear community,
I would like to ask for your help (if anyone has an idea) with preventing the following miscalculation:
Problem: Suppose a formula contains two unknowns whose dimensions are not yet known. These two unknown variables are multiplied with each other; however, when I substitute matrices for those two variables, the " * " (multiplication symbol) performs an elementwise multiplication. Is there any way to force it to do a matrix multiplication instead? Below you will find a small sample code.
The matrix multiplication should give [2 1; 1 1], but instead I get back [1 1; 1 0]. The constraint of the problem is that, from the start, I have matrices inside a matrix.
Thanks in advance!
Adrian
clear all; close all; clc
syms Ku Kv
K = [Ku 0; 0 Kv];
Ku_mat= [1 1; 1 0];
Kv_mat = [1 1; 1 0];
z = det(K); % gives back z = Ku*Kv
z = subs(z, {Ku, Kv}, {Ku_mat, Kv_mat}); % '*' is applied elementwise here, yielding [1 1; 1 0]
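For reference, here is a small numeric check (my addition, not part of the original post) showing how the two products differ on the sample matrices above; it uses plain numeric MATLAB, no Symbolic Math Toolbox needed:

```matlab
A = [1 1; 1 0];

% Elementwise product: each entry multiplied by the matching entry.
A .* A   % -> [1 1; 1 0], which is what the substitution effectively computes

% Matrix product: rows times columns.
A * A    % -> [2 1; 1 1], the intended result
```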
Answers (1)
madhan ravi on 9 Jul 2020
Edited: madhan ravi on 9 Jul 2020
At the moment this isn’t possible using subs().
It would be better to submit a feature enhancement request to MathWorks.
A workaround:
z = det(K);
% Build the string '@(Ku, Kv)Ku*Kv' from the expression, stripping the
% brackets that char() puts around the variable list, then turn it into
% a function handle so that '*' is evaluated as a true matrix product.
expre = regexprep(['@(', char(symvar(z)), ')', char(z)], '\[|]', '');
z1 = str2func(expre);
['Inputs must contain: ', char(symvar(z))]
Wanted = z1(Ku_mat, Kv_mat)
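As a quick sanity check (my addition, not part of the original answer): for this example the generated string evaluates to the anonymous function sketched below, and calling it on the matrices from the question reproduces the expected matrix product:

```matlab
% What str2func produces for z = Ku*Kv in this example:
z1 = @(Ku, Kv) Ku*Kv;

Ku_mat = [1 1; 1 0];
Kv_mat = [1 1; 1 0];
Wanted = z1(Ku_mat, Kv_mat)   % Wanted = [2 1; 1 1]
```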