Load a pretrained neural network object into rlNeuralNetworkEnvironment
Hi,
I want to train an RL MBPO agent that samples from an environment model. The model is a deep learning network object that I have already trained in MATLAB. How can I load its weights into the environment object? The examples for rlNeuralNetworkEnvironment show how to define a network structure, but I would like to use my pretrained weights instead.
Best Regards,
Vasu
Answers (1)
Emmanouil Tzorakoleftherakis
on 21 Dec 2023
Hi Vasu,
You can use a pretrained environment model with an MBPO agent as follows:
1) Create an rlContinuousDeterministicTransitionFunction from the trained dlnetwork if the model is deterministic, or an rlContinuousGaussianTransitionFunction if it is stochastic (mean and standard-deviation output heads).
2) Create an rlNeuralNetworkEnvironment using the transition function defined in step 1.
3) Create the MBPO agent.
4) Set LearnRate = 0 in the TransitionOptimizerOptions property of rlMBPOAgentOptions so the pretrained model is not updated during training (see the sketch after this list).
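For concreteness, here is a minimal sketch of those four steps. It assumes a deterministic pretrained dlnetwork named dlnet with input layers named "state" and "action" and an output layer named "nextState"; the observation/action specs and the myRewardFcn/myIsDoneFcn handles are placeholders you would replace with your own.
```matlab
% Minimal sketch, assuming a deterministic pretrained dlnetwork "dlnet"
% with input layers "state" and "action" and output layer "nextState".
% Specs, layer names, and myRewardFcn/myIsDoneFcn are placeholders.

obsInfo = rlNumericSpec([4 1]);                             % example observation spec
actInfo = rlNumericSpec([1 1],LowerLimit=-1,UpperLimit=1);  % example action spec

% 1) Wrap the pretrained network as a deterministic transition function
transitionFcn = rlContinuousDeterministicTransitionFunction(dlnet, ...
    obsInfo,actInfo, ...
    ObservationInputNames="state", ...
    ActionInputNames="action", ...
    NextObservationOutputNames="nextState");

% 2) Build the neural network environment around the transition model;
%    reward and is-done signals come from custom function handles here
env = rlNeuralNetworkEnvironment(obsInfo,actInfo,transitionFcn, ...
    @myRewardFcn,@myIsDoneFcn);

% 3) Create the MBPO agent around an off-policy base agent
%    (a default SAC agent is used here purely as an example)
baseAgent = rlSACAgent(obsInfo,actInfo);

% 4) Zero learning rate so the pretrained transition model stays fixed,
%    per the suggestion above
mbpoOpts = rlMBPOAgentOptions( ...
    TransitionOptimizerOptions=rlOptimizerOptions(LearnRate=0));

agent = rlMBPOAgent(baseAgent,env,mbpoOpts);
```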
Hope this helps