
Mark Hudson Beale


MathWorks

Active since 2011

Followers: 0   Following: 0


Statistics

All
  • Knowledgeable Level 3
  • Revival Level 3
  • Knowledgeable Level 2
  • First Answer
  • Solver


Feeds


Answered
GPU computing doesn't work with timedelaynet, inputDelays != 0:0
Currently the GPU implementation of training does not parallelize for single series. If you have a long series and can break it...

almost 9 years ago | 0

Answered
How can I extract the values of weights and biases after each training epoch?
Greg is right, the function to get weights outside of a training function is getwb. Within a training function it is slightly...

more than 9 years ago | 0
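
A minimal sketch of the GETWB usage mentioned above; the dataset and layer size are arbitrary placeholders, and training one epoch per call is one common way to capture the vector after each epoch:

    [x,t] = simplefit_dataset;          % sample dataset shipped with the toolbox
    net = fitnet(10);                   % 10 hidden neurons, arbitrary choice
    net.trainParam.epochs = 1;          % train a single epoch per call
    net.trainParam.showWindow = false;
    wb = {};
    for epoch = 1:20
        net = train(net,x,t);           % continues from the current weights
        wb{end+1} = getwb(net);         % all weights and biases as one column vector
    end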

Answered
GPU training of neural network with parallel computing toolbox unreasonably slow, what am I missing?
Getting a speedup with a GPU requires a couple of things: 1) The amount of time spent in gradient calculations (which happen on...

more than 9 years ago | 0

| Accepted

Answered
When using GPU with neural net, I run out of shared memory per block; is there a way to handle?
I was able to reproduce your error. In MATLAB 13a the nndata2gpu array transformation is no longer required and if gpuArray is u...

more than 11 years ago | 0
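
A sketch of the simpler calling sequence available in newer releases (R2012b and later), where train and the network accept a 'useGPU' option directly; the dataset and layer size are placeholders:

    [x,t] = house_dataset;                  % example dataset from the toolbox
    net = fitnet(20);
    net = train(net,x,t,'useGPU','yes');    % no nndata2gpu conversion needed
    y = net(x,'useGPU','yes');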

Solved


Column Removal
Remove the nth column from input matrix A and return the resulting matrix in output B. So if A = [1 2 3; 4 5 6]; and ...

more than 12 years ago
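
One possible solution sketch for this problem:

    function B = column_removal(A,n)
        B = A;
        B(:,n) = [];        % delete the nth column by assigning the empty matrix
    end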

Solved


Add two numbers
Given a and b, return the sum a+b in c.

more than 12 years ago

Solved


Find the sum of all the numbers of the input vector
Find the sum of all the numbers of the input vector x. Examples: Input x = [1 2 3 5] Output y is 11 Input x ...

more than 12 years ago

Solved


Check if sorted
Check if sorted. Example: Input x = [1 2 0] Output y is 0

more than 12 years ago

Solved


Is my wife right?
Regardless of input, output the string 'yes'.

more than 12 years ago

Answered
Delays in the Neural Network Toolbox
Yes, you are correct. The way you have set the delays, the network at each time step will be responsive to the current fuzzy time of ...

more than 12 years ago | 0

| Accepted

Answered
Architecture of the neural network by nftool?
You have correctly understood how the main part of the neural network works; however, the inputs and outputs of the neural netwo...

more than 12 years ago | 0

| Accepted
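
A sketch of how to inspect the extra input/output processing that nftool-generated networks wrap around the layers; removeconstantrows and mapminmax are the defaults:

    net = fitnet(10);                       % same two-layer architecture nftool creates
    net.inputs{1}.processFcns               % typically {'removeconstantrows','mapminmax'}
    net.outputs{2}.processFcns              % the same processing, reversed on outputs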

Answered
Disable Spacebar Command for Neural Network
You can turn off the training window (and therefore not have a stop button to accidentally trigger) with this command before cal...

about 13 years ago | 0
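
A sketch of the workaround described above, with placeholder data; disabling the training window means there is no stop button to trigger accidentally:

    [x,t] = simplefit_dataset;
    net = fitnet(10);
    net.trainParam.showWindow = false;      % set before calling train
    net = train(net,x,t);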

Answered
Combine parallel toolbox and neural network toolbox
Currently parallel computing can be used to train multiple networks on different MATLAB workers at the same time. This speeds u...

about 13 years ago | 0
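
A sketch of that approach, assuming a local pool of workers is open: each worker trains its own copy of the network independently, and the dataset and layer size are placeholders:

    [x,t] = simplefit_dataset;
    nets = cell(1,4);
    parfor i = 1:4
        net = fitnet(10);
        net.trainParam.showWindow = false;  % no training GUI on the workers
        nets{i} = train(net,x,t);           % independent training run per worker
    end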

Answered
Neural Nets for Classification
Yes, PATTERNNET is recommended for classification problems. TRAINLM is a good training function for most problems. For small...

about 13 years ago | 0

| Accepted
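
A minimal classification sketch with PATTERNNET on one of the toolbox's example datasets; the layer size is an arbitrary choice:

    [x,t] = iris_dataset;                   % 4 inputs, 3 classes as one-hot columns
    net = patternnet(10);
    net = train(net,x,t);
    y = net(x);
    classes = vec2ind(y);                   % index of the winning class per sample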

Answered
Knowing the Weights in Matlab
The biases for each layer i are net.b{i}. So for a two layer network the biases are net.b{1} and net.b{2}. The weights to la...

about 13 years ago | 4

| Accepted
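
A sketch of that indexing for a trained two-layer network (placeholder data and layer size):

    [x,t] = simplefit_dataset;
    net = fitnet(10);
    net = train(net,x,t);
    b1 = net.b{1};                          % hidden-layer biases
    b2 = net.b{2};                          % output-layer bias
    IW = net.IW{1,1};                       % weights from the input to layer 1
    LW = net.LW{2,1};                       % weights from layer 1 to layer 2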

Answered
Crossvalidation of Neural Networks
I am not sure I understand your question but perhaps this will help. If you want to divide your data set into a design set an...

about 13 years ago | 0
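
A sketch, assuming the goal is a fixed design/test split instead of the default random division; the index proportions are placeholders:

    [x,t] = simplefit_dataset;
    Q = size(x,2);                          % number of samples
    net = fitnet(10);
    net.divideFcn = 'divideind';            % divide by explicit indices
    net.divideParam.trainInd = 1:round(0.7*Q);
    net.divideParam.valInd   = round(0.7*Q)+1:round(0.85*Q);
    net.divideParam.testInd  = round(0.85*Q)+1:Q;
    net = train(net,x,t);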

Answered
NARX perform additional tests on network: from GUI to code
To apply the network to new data after training do the following: inputSeries2 = { ... your new input series ... }; [inp...

about 13 years ago | 1
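
A sketch of that calling sequence end to end, with a toolbox example series standing in for the original and new data:

    [X,T] = simplenarx_dataset;
    net = narxnet(1:2,1:2,10);
    [xs,xi,ai,ts] = preparets(net,X,{},T);
    net = train(net,xs,ts,xi,ai);
    X2 = X(1:50);  T2 = T(1:50);            % placeholders for the new series
    [xs2,xi2,ai2] = preparets(net,X2,{},T2);
    y2 = net(xs2,xi2,ai2);                  % network response on the new data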

Answered
Unable to load trained network, maybe versionconflict
Unfortunately the R2010a version of the toolbox cannot support neural networks created with R2010b or later versions of the tool...

about 13 years ago | 0

Answered
Neural Network - Multi Step Ahead Prediction
Here is an example that may help. A NARX network is trained on series inputs X and targets T, then the simulation is picked up ...

about 13 years ago | 2
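
A sketch of that pattern: train in open loop, capture the final delay states, then close the loop and continue the simulation from where the training data left off (the example series is a placeholder):

    [X,T] = simplenarx_dataset;
    net = narxnet(1:2,1:2,10);
    [xs,xi,ai,ts] = preparets(net,X(1:end-5),{},T(1:end-5));
    net = train(net,xs,ts,xi,ai);
    [y,xf,af] = net(xs,xi,ai);              % final input/layer states after the open-loop run
    [netc,xic,aic] = closeloop(net,xf,af);  % convert the states for the closed-loop net
    y2 = netc(X(end-4:end),xic,aic);        % predict the last 5 steps ahead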

Answered
How to forecast with Neural Network?
You can convert the NARXNET from open-loop to closed-loop form to predict ahead any number of timesteps for which you have data ...

more than 13 years ago | 4
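
A sketch of converting a trained NARXNET from open-loop to closed-loop form; the dataset and delays are placeholders:

    [X,T] = simplenarx_dataset;
    net = narxnet(1:2,1:2,10);
    [xs,xi,ai,ts] = preparets(net,X,{},T);
    net = train(net,xs,ts,xi,ai);
    netc = closeloop(net);                  % feed predicted outputs back as inputs
    [xsc,xic,aic,tsc] = preparets(netc,X,{},T);
    yc = netc(xsc,xic,aic);                 % multi-step-ahead predictions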

Answered
Different results by backpropagation algorithm using different MatLab versions (2008 and 2010)
First, I assume you are setting the random seed before running this code to try and reproduce exact results? Otherwise, every r...

more than 13 years ago | 0

| Accepted
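
A sketch of fixing the seed before the network is created, so the initial weights and the data division repeat exactly from run to run; the seed value is arbitrary, and rng applies to recent releases:

    rng(0);                                 % setdemorandstream is the toolbox's equivalent helper
    [x,t] = simplefit_dataset;
    net = fitnet(10);
    net = train(net,x,t);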

Answered
Self Organizing Maps
The prototype pattern for each neuron is its weight vector. To see all the neurons' weight vectors: net.IW Each row rep...

more than 13 years ago | 0

| Accepted
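
A sketch of reading the prototype vectors out of a trained SOM; the data and map size are placeholders:

    x = rand(3,200);                        % 200 samples of 3-dimensional data
    net = selforgmap([5 5]);                % 5-by-5 map, 25 neurons
    net = train(net,x);
    W = net.IW{1,1};                        % 25x3 matrix; row i is neuron i's prototype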

Answered
how to use weights and thresholds from Neural Network toolbox
You need to first preprocess inputs, then post process outputs as follows: xx = [0:0.5:5] for i=1:length(net.inputs{1}.p...

more than 13 years ago | 0
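
A sketch of that pre/post-processing flow with the default functions, assuming mapminmax is the second processing step after removeconstantrows (the default) and that no constant rows are actually removed:

    [x,t] = simplefit_dataset;
    net = fitnet(10);
    net = train(net,x,t);
    xn = mapminmax('apply',x,net.inputs{1}.processSettings{2});    % preprocess inputs
    yn = net.LW{2,1}*tansig(net.IW{1,1}*xn + net.b{1}) + net.b{2}; % manual forward pass
    y  = mapminmax('reverse',yn,net.outputs{2}.processSettings{2});% undo output scaling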

Answered
How do I display the connection weights after each epoch for a perceptron network using MATLAB code?
The function PLOTWB displays weights and biases graphically. plotwb(net) You can attach this function to any network obj...

more than 13 years ago | 0

| Accepted
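
A sketch of calling PLOTWB directly and of attaching it to the network's plot list so it is available from the training window; data and layer size are placeholders:

    [x,t] = simplefit_dataset;
    net = fitnet(10);
    net.plotFcns = [net.plotFcns {'plotwb'}];   % add to the training window's plot buttons
    net = train(net,x,t);
    plotwb(net)                                 % weights and biases shown graphically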

Answered
Change preprocessing parameter in neural network
You can try this workaround: net = struct(net); net.inputs{1}.processParams{2}.ymin = 0.1; net = network(net);

more than 13 years ago | 0

| Accepted

Answered
Plant model
The function GENSIM converts a neural network, once it has been trained in MATLAB, into an equivalent Simulink block diagram.

more than 13 years ago | 1

| Accepted
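
A sketch of that workflow (Simulink is required; the dataset is a placeholder):

    [x,t] = simplefit_dataset;
    net = fitnet(10);
    net = train(net,x,t);
    gensim(net)                             % generates a Simulink model containing the network block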

Answered
Neural network performance function, weighted sse, and false alarms
Error weights can help you set which targets are most important to get correct, or equivalently, more costly to get wrong. Le...

more than 13 years ago | 1
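
A sketch of passing error weights to train through its optional EW argument; the weighting rule below is an arbitrary illustration, not from the original answer:

    [x,t] = simplefit_dataset;
    net = fitnet(10);
    ew = ones(size(t));
    ew(t > 5) = 10;                         % make errors on these targets 10x as costly
    net = train(net,x,t,{},{},ew);          % Xi and Ai left empty for a static network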

Answered
How to use NNTOOL
If the input has 300 elements, and the hidden layer has 50 neurons, then each of the 300 input elements will have a connection ...

more than 13 years ago | 0
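
A sketch of those dimensions: once the network is configured for 300-element inputs with 50 hidden neurons, the input weight matrix is 50-by-300 (the random data is only for sizing):

    net = feedforwardnet(50);
    net = configure(net, rand(300,20), rand(1,20));   % 300-element inputs, 20 dummy samples
    size(net.IW{1,1})                                 % returns [50 300]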