Saving RL agent with hindsight experience replay

Syed Adil Ahmed on 11 Oct 2024
Answered: Umar on 12 Oct 2024
Hi,
When I train an RL agent with hindsight experience replay and then use the save command at the end to save the agent, the saved agent's ExperienceBuffer is not a hindsight experience replay buffer but just a standard experience buffer; only the buffer length matches the hindsight experience replay buffer. During training, that is before saving, the agent variable does show the hindsight experience replay buffer.
This is the save command I use:
filename = sprintf('tD3_HER%d.mat', j);
save(filename,"agent","env","trainingStats")
The picture below shows the trained agent, where ExperienceBuffer is just the standard experience buffer.
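
To illustrate, this is roughly how I check the buffer type before saving and after reloading (the class names are what I expect from the Reinforcement Learning Toolbox replay memory objects, not confirmed output):

   % Before saving: during training the buffer shows up as the HER replay memory
   disp(class(agent.ExperienceBuffer))

   % After reloading the MAT-file: the buffer appears as a standard experience buffer
   loaded = load(filename);
   disp(class(loaded.agent.ExperienceBuffer))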

Answers (1)

Umar on 12 Oct 2024

Hi @Syed Adil Ahmed,

After reviewing the documentation provided at the link below

https://www.mathworks.com/help/matlab/ref/save.html

and analyzing your comments, it seems this issue likely stems from how MATLAB handles object properties when saving. When you save an object, MATLAB captures its properties at that moment, but properties tied to dynamic features such as custom replay buffers may not be serialized properly if they are not explicitly part of the object's state, or if they depend on training configurations that are not retained after training. Here is a structured approach to resolve the issue:

Check Object Properties: Ensure that the HER buffer is indeed a property of the agent object. You can do this by inspecting the agent's properties before saving it:

   disp(agent);
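
If you are using the built-in replay memory objects, checking the concrete class of the buffer itself is more specific than calling disp on the whole agent (a sketch; I would expect a HER buffer to report a hindsight replay memory class rather than a standard one):

   % Sketch: inspect the concrete class of the experience buffer before saving
   bufferClass = class(agent.ExperienceBuffer);
   disp(bufferClass)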

Explicitly Define HER Buffer: If the HER buffer is a dynamically created property (i.e., created during training but not defined in the constructor), you may need to modify your agent class to ensure that it includes HER as a property:

   properties
       ExperienceBuffer % Add this if it's not already present
   end
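
As a minimal sketch (the class and constructor here are placeholders, not an existing API), the buffer could be declared as a property and assigned in the constructor so it is part of the object's saved state:

   classdef MyHERAgent < handle
       properties
           ExperienceBuffer   % HER replay memory object, saved with the agent
       end
       methods
           function obj = MyHERAgent(herBuffer)
               obj.ExperienceBuffer = herBuffer;   % assign explicitly at construction
           end
       end
   end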

Custom Save Method: Implement a custom save method within your agent class that explicitly saves all necessary components of your agent's state, including the HER buffer:

   function saveAgent(obj, filename)
       % Save the full agent object, including its ExperienceBuffer property.
       save(filename, 'obj');   % add other variables to this call if needed
   end
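
A call from your training script might then look like this (the filename pattern follows your example; the method name is illustrative):

   % Sketch: save the j-th trained agent through the custom method
   filename = sprintf('tD3_HER%d.mat', j);
   saveAgent(agent, filename);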

Use Structs for Complex Objects: If modifying the class is not feasible, consider converting your agent to a struct before saving. This allows you to control exactly what gets saved:

   agentStruct = struct(agent);
   save(filename, 'agentStruct');
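
Alternatively, if the main goal is simply not to lose the HER buffer, you can save it as its own variable alongside the agent (assuming ExperienceBuffer is a readable property of your agent):

   % Sketch: store the HER buffer as a separate variable in the MAT-file
   herBuffer = agent.ExperienceBuffer;
   save(filename, 'agent', 'herBuffer', 'env', 'trainingStats');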

Load and Verify: After implementing these changes, test loading the saved file and check if the HER buffer is restored correctly:

   loadedData = load(filename);
   disp(loadedData.agentStruct); % Check properties here
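
If the buffer was saved as a separate variable as sketched above, it can be reattached after loading (this assumes ExperienceBuffer is settable on the loaded agent):

   % Sketch: reattach the HER buffer to the reloaded agent
   loadedData = load(filename);
   agent = loadedData.agent;
   if isfield(loadedData, 'herBuffer')
       agent.ExperienceBuffer = loadedData.herBuffer;
   end
   disp(class(agent.ExperienceBuffer))   % verify the buffer type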

Hope this helps.

Please let me know if you have any further questions.

Version

R2023b
