How can I set the initial value of the action space when using a Simulink DDPG agent?

lei on 20 Nov 2024
Answered: Shlok on 29 Nov 2024
I have a robot model in Simulink, and I want to train the robot using a DDPG agent.
My question is: how can I set the initial value of the action space? I want the action to start from a specific value, such as zero.

Answers (1)

Shlok on 29 Nov 2024
Hi Lei,
DDPG agents are designed to operate in continuous action spaces. To create a custom continuous action space for a DDPG agent, you can use the "rlNumericSpec" function, which creates a specification object for a numeric action or observation channel. By setting its "LowerLimit" property to zero, you constrain the action range so that the agent's actions are bounded below by zero.
Here is some sample code:
actionInfo = rlNumericSpec([1,1], 'LowerLimit', 0, 'UpperLimit', 1);
observationInfo = rlNumericSpec([4 1]);      % example: a 4-element observation vector
agtInitOpts = rlAgentInitializationOptions;  % default agent initialization options
agent = rlDDPGAgent(observationInfo, actionInfo, agtInitOpts);
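As a quick sanity check (a minimal sketch, assuming the example 4-element observation specification above), you can query the untrained agent for an action and confirm it respects the specified limits:
obs = {rand(4,1)};            % a sample observation matching observationInfo
act = getAction(agent, obs);  % getAction returns a cell array of actions
disp(act{1})                  % the value should lie within [0, 1]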
To know more about the "rlNumericSpec" function, you can refer to the MATLAB documentation.
