Putting symbolic array into matrix

Ramses Young on 29 Apr 2022
Hello,
I am trying to subtract a symbolic variable from every value in a 1×n array, then solve for that unknown variable later, but I can't seem to get the symbolic array to nest correctly.
Here is an example of what I am trying to do,
syms q0
Vy=[300 500 300]
q(1,:)=[q0-(Vy.*2.*4./8)]
After this, q(i,:) will be in a for loop that will use this symbolic variable.
I keep getting an error and I don't know how to modify this so that I can still solve for q0 later on.
Thanks for the help!!!

Accepted Answer

Walter Roberson on 29 Apr 2022
That code works in the form you posted.
syms q0
Vy=[300 500 300]
Vy = 1×3
   300   500   300
q(1,:)=[q0-(Vy.*2.*4./8)]
q = [q0 - 300, q0 - 500, q0 - 300]
The most common mistake in a situation like this is to initialize q using zeros(), such as
q = zeros(5,3);
If you had initialized q as a double-precision array, then assigning into q(1,:) would fail, because the right-hand side contains a symbolic variable and cannot be stored in a double array.
The cure for that is to preallocate q as a symbolic array instead, using
q = zeros(5, 3, 'sym');
in new enough versions of MATLAB, or
q = sym(zeros(5,3));
if your MATLAB is older than that.
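As a sketch of the full pattern the question describes (symbolic preallocation, a for loop filling q(i,:), then solving for q0 later), the following assumes an illustrative 5×3 size, loop body, and final equation, none of which come from the original post:

```matlab
syms q0
Vy = [300 500 300];

% Preallocate q as a symbolic array so each row can hold expressions in q0.
q = sym(zeros(5, 3));            % or zeros(5, 3, 'sym') in newer releases

for i = 1:5
    % Illustrative loop body: any expression involving q0 can be stored.
    q(i, :) = q0 - (Vy .* 2 .* 4 ./ 8) * i;
end

% Later, solve one of the stored expressions for q0.
% Example (hypothetical equation): q(1,1) == 0 gives q0 = 300.
q0_value = solve(q(1, 1) == 0, q0);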

More Answers (0)

Categories

Learn more about Symbolic Math Toolbox in Help Center and File Exchange

Version

R2018a
