Neural Network Simulink Block Library
The Deep Learning Toolbox™ product provides a set of blocks that you can use to build neural networks in Simulink®. In addition, the function gensim can generate the Simulink version of any network you have created in MATLAB®.
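As a sketch of the gensim workflow, the following trains a small feedforward network on made-up data and then generates its Simulink version (the data values and hidden-layer size here are purely illustrative):

```matlab
% Train a small feedforward network, then generate a Simulink model of it.
x = [0 1 2 3 4 5 6 7 8 9 10];      % illustrative inputs
t = [0 1 2 3 4 3 2 1 2 3 4];       % illustrative targets
net = feedforwardnet(10);          % one hidden layer with 10 neurons
net = train(net, x, t);
gensim(net)                        % opens a Simulink model of the trained network
```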
To open the Deep Learning Toolbox block library, at the MATLAB Command Window, enter:

   neural

This command opens a library window that contains five blocks. Each of these blocks contains additional blocks.
Transfer Function Blocks
Double-click the Transfer Functions block in the Neural library window to open a window that contains several transfer function blocks.
Each of these blocks takes a net input vector and generates a corresponding output vector whose dimensions are the same as those of the input vector.
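The transfer function blocks mirror the MATLAB transfer functions of the same names, each mapping a net input vector elementwise to an output vector of the same size. A small sketch with illustrative values:

```matlab
% Apply two common transfer functions to a net input column vector.
n  = [-2; 0; 2];     % illustrative net input values
a1 = tansig(n);      % hyperbolic tangent sigmoid, applied elementwise
a2 = logsig(n);      % log-sigmoid, applied elementwise; same size as n
```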
Net Input Blocks
Double-click the Net Input Functions block in the Neural library window to open a window that contains two net-input function blocks.
Each of these blocks takes any number of weighted input vectors, weighted layer output vectors, and bias vectors, and returns a net input vector.
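The two net input blocks correspond to the MATLAB functions netsum and netprod. As a sketch (the vector values are illustrative, and the cell-array calling form assumed here may vary across toolbox versions):

```matlab
% Combine two weighted input vectors and a bias vector into a net input.
z1 = [1; 2];  z2 = [3; 4];  b = [0.5; -0.5];   % illustrative vectors
n_sum  = netsum({z1, z2, b});    % elementwise sum of all input vectors
n_prod = netprod({z1, z2, b});   % elementwise product of all input vectors
```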
Weight Blocks
Double-click the Weight Functions block in the Neural library window to open a window that contains three weight function blocks.
Each of these blocks takes a neuron weight vector and applies it to an input vector (or a layer output vector) to get a weighted input value for a neuron.
These blocks expect the neuron weight vector to be a column vector. This is because Simulink signals can be column vectors, but cannot be matrices or row vectors.
As a result of this limitation, to implement a weight matrix going to a layer with S neurons, you must create S weight function blocks (one for each row).
This contrasts with the other two kinds of blocks. Only one net input function and one transfer function block are required for each layer.
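The dot-product weight block corresponds to the MATLAB function dotprod. In MATLAB the whole weight matrix can be applied at once, whereas in Simulink each block handles one row, which is why a layer with S neurons needs S blocks. A sketch with illustrative values:

```matlab
% Apply a weight matrix to an input vector with dotprod.
W = [1 2; 3 4; 5 6];   % 3 neurons, 2 inputs (illustrative weights)
p = [0.5; -1];         % input column vector
z = dotprod(W, p);     % weighted input, one element per neuron
% Equivalent to calling dotprod(W(i,:)', p) row by row, which is what
% the S separate weight function blocks do in Simulink.
```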
Processing Blocks
Double-click the Processing Functions block in the Neural library window to open a window that contains processing blocks and their corresponding reverse-processing blocks.
You can use each of these blocks to preprocess inputs and postprocess outputs.
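A typical processing function is mapminmax, whose forward form rescales each row of the input to [-1, 1] and whose 'reverse' form undoes the mapping, matching the processing and reverse-processing block pair. A sketch with illustrative data:

```matlab
% Preprocess raw input, then reverse the mapping to recover it.
x = [1 2 3 4 5];                          % illustrative raw input row
[y, settings] = mapminmax(x);             % preprocess: y spans [-1, 1]
x2 = mapminmax('reverse', y, settings);   % postprocess: recovers x
```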