Putting symbolic array into matrix
Hello,
I am trying to take a symbolic variable and subtract it from every value in a 1xn array, and then solve for this unknown variable later, but I can't seem to get the symbolic array to nest correctly.
Here is an example of what I am trying to do:
syms q0
Vy=[300 500 300]
q(1,:)=[q0-(Vy.*2.*4./8)]
After this, q(i,:) will be in a for loop that will use this symbolic variable.
I keep getting an error, and I don't know how to modify this so that I can still solve for q0 later on.
Thanks for the help!!!
Accepted Answer
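A likely cause of the error is that q already exists as a regular double array, so MATLAB refuses to assign a symbolic expression into one of its rows. A minimal sketch of one common fix, assuming the Symbolic Math Toolbox is installed: preallocate q as a symbolic array before the loop. The ==0 condition passed to solve is only an illustrative assumption, not part of the original question.

```matlab
syms q0
Vy = [300 500 300];

% Preallocate q as symbolic so row assignments keep q0 unevaluated
n = numel(Vy);
q = sym(zeros(1, n));              % grow the first dimension if the loop fills more rows

q(1,:) = q0 - (Vy .* 2 .* 4 ./ 8); % subtract each scaled element of Vy from q0

% Later, solve for q0 -- here, hypothetically, requiring the first entry to vanish
sol = solve(q(1,1) == 0, q0);
```

Note that the .* and ./ operators already apply elementwise across Vy, so the square brackets around the expression in the original post are unnecessary (though harmless).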
