How to use a neural network that takes scalars as input and gives a time series as output

Hello there everyone,
I have a mind-boggling question about neural networks. We normally train neural networks on equally sampled inputs and outputs (for instance, we give 5 features, each with 100 samples, and get one output feature, also with 100 samples). What about training a neural network that, for example, takes 2 features, each with only 5 samples, and gives out a whole time series?
Suppose that we have the following differential equation:
dy/dt = a*t/b
where a and b are constants. We solve this equation five times for different values of a and b and get a time series each time (so we have 5 time series). How can I train a neural network in MATLAB so that, by entering arbitrary values of a and b, I get the corresponding time series as output? Thanks in advance for your time devoted to this question.
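
For concreteness, here is a rough sketch of what I have in mind (just an illustration of the setup, not something I claim is the right approach): assuming y(0) = 0, the analytic solution is y(t) = a*t^2/(2*b), so the five training series can be generated directly. The five (a, b) pairs below and the choice of feedforwardnet from Deep Learning Toolbox are placeholders.

% Rough sketch: map the scalar parameters [a; b] to a sampled time series.
% Assumes y(0) = 0, so dy/dt = a*t/b integrates to y(t) = a*t^2/(2*b).
t  = linspace(0, 1, 100);            % common time grid, 100 samples
ab = [1 2 3 4 5;                     % five placeholder values of a
      2 1 4 3 5];                    % five placeholder values of b
Y  = zeros(numel(t), size(ab, 2));   % one column of y(t) per (a, b) pair
for k = 1:size(ab, 2)
    Y(:, k) = ab(1, k) * t.^2 / (2 * ab(2, k));
end

net = feedforwardnet(10);            % one hidden layer with 10 neurons
net = train(net, ab, Y);             % inputs are 2-by-5, targets are 100-by-5

yPred = net([2.5; 1.5]);             % predicted 100-by-1 series for a new (a, b)
plot(t, yPred)

With only five (a, b) pairs the fit will of course be very rough, so in practice I would generate many more pairs the same way.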

Answers (0)

Categories

Find more on Deep Learning Toolbox in Help Center and File Exchange

Version

R2019b
