MATLAB Answers

How can I extract features from a set of matrices and vectors to be used in machine learning?

JUDITH NJOKU on 30 Apr 2020
Answered: Mahesh Taparia on 13 May 2020
Hello friends,
I have a task where I need to train a machine learning model to predict a set of outputs from multiple inputs. My inputs are 1000 iterations of a set of 3x1 vectors, a set of 3x3 covariance matrices, and a set of scalars, while my output is just a set of scalars. I cannot use the Regression Learner app directly because it requires all inputs to have the same dimensions. Any idea on how to unify them?


Answers (1)

Mahesh Taparia on 13 May 2020
Hi,
You can vectorize each 3x3 matrix to a 9x1 vector, then append it to the rest of the features to form a d-dimensional input for each sample (i.e., dx1). If the dimension is too high, you can use Principal Component Analysis (PCA) to reduce it. You can then use this dataset in Regression Learner to train your model.
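A minimal sketch of this idea, assuming hypothetical variable names (v for the 3x1 vectors, C for the 3x3 covariance matrices, s for the input scalars, y for the output scalars, each with 1000 samples along the last dimension) and that the Statistics and Machine Learning Toolbox is available for pca:

```matlab
% Hypothetical inputs:
%   v: 3x1000 vectors, C: 3x3x1000 covariance matrices,
%   s: 1x1000 scalars, y: 1x1000 target scalars
N = 1000;
X = zeros(N, 13);                       % 3 + 9 + 1 = 13 features per sample
for k = 1:N
    % Flatten the k-th covariance matrix and concatenate all features
    X(k, :) = [v(:, k)', reshape(C(:, :, k), 1, 9), s(k)];
end

% Optional: reduce dimensionality with PCA, keeping ~95% of the variance
[coeff, score, ~, ~, explained] = pca(X);
numComp = find(cumsum(explained) >= 95, 1);
Xreduced = score(:, 1:numComp);

% Combine features and response into a table for Regression Learner
tbl = array2table([Xreduced, y(:)]);
```

Note that since a covariance matrix is symmetric, you could also keep only the 6 upper-triangular entries instead of all 9 to avoid redundant features.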

