Load data into experience buffer: DDPG agent

I am using RL Toolbox version 1.1 with MATLAB R2019b and the DDPG agent to design a controller. Is there a way to load data (state, action, reward, next state) collected from real experiments into the experience buffer before starting training?

Answers (2)

JiaZheng Yan
JiaZheng Yan on 31 Mar 2020

1 vote

I found a way to view the Memory of the experience buffer.
You can open the file "ExperienceBuffer.m", which is in "...\Matlab\toolbox\rl\rl\+rl\+util".
In this file, you can see the property definition of the variable Memory.
Then you set:
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;
After your training, you can get the data in "agent.ExperienceBuffer.Memory".
This also means that you can modify and reuse the training data.
I hope this method works for you : )
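Putting the steps above together, a minimal sketch (assuming a DDPG agent has already been created with these options; the option names are as documented for `rlDDPGAgentOptions` in the RL Toolbox):

```matlab
% Configure the agent options so the experience buffer survives save/load
agentOpts = rlDDPGAgentOptions;
agentOpts.SaveExperienceBufferWithAgent = true;        % keep the buffer when saving the agent
agentOpts.ResetExperienceBufferBeforeTraining = false; % reuse the buffer on the next training run

% ... create the agent with agentOpts and train as usual ...

% After training, the collected transitions can be inspected:
mem  = agent.ExperienceBuffer.Memory; % cell array of stored experiences
exp1 = mem{1};                        % one (state, action, reward, next state, isdone) tuple
```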

8 comments

Ao Liu
Ao Liu on 3 Jul 2020
Hello, I am now using the 2020a toolbox and ran into this problem too. Changing it as you described does not seem to work; is there any solution? Thanks!
I tried it myself: the property settings of the Memory variable can be displayed and accessed in both of the locations shown.
Remember to add the following lines to the code where you preset the agent's training options:
agentOpts.SaveExperienceBufferWithAgent = true; % (this line is the key)
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training finishes, you should be able to see the Memory elements.
The length of Memory may affect how the data is displayed; by indexing into it you can see the data in more detail:
a = agent.ExperienceBuffer.Memory{1} % take one element to inspect
The fields represent (state, action, reward, next state, is_done).
Save this agent and load it for the next training run, and you can train the agent repeatedly.
Fabian Hart
Fabian Hart on 21 Jul 2020
Thanks for your answer!
Unfortunately, I have problems writing to MATLAB system files. When I try to do that, the following message appears:
"Error writing ExperienceBuffer.m .... Access denied"
Could you please tell me how you managed that? (Windows 10)
JiaZheng Yan
JiaZheng Yan on 23 Jul 2020
Sorry, I hope you can provide a more detailed description or a screenshot of the error, because I have never encountered such an error.
(I guess it is a file path problem.)
zhou jianhao
zhou jianhao on 25 Oct 2021
Hi, Jiazheng.
Your method is working, thanks a lot!
One thing still bothers me: is there any way to access the memory buffer during training, for example if I want to use prioritized experience replay?
Hope to hear from you soon.
Thanks!
Regards!
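For readers on later releases: this was not possible in R2019b, but newer versions of the RL Toolbox (around R2022b) added `rlPrioritizedReplayMemory`, which can be assigned as the agent's experience buffer. A hedged sketch, assuming an existing DDPG agent and its observation/action specs:

```matlab
% Hedged sketch for newer releases only (roughly R2022b+):
% replace the agent's default buffer with a prioritized replay memory.
obsInfo = getObservationInfo(env);  % specs from an existing environment
actInfo = getActionInfo(env);

agent.ExperienceBuffer = rlPrioritizedReplayMemory(obsInfo, actInfo, 1e6);
```

Check the `rlPrioritizedReplayMemory` documentation for the exact release in which it was introduced and for its tunable priority parameters.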
zhou jianhao
zhou jianhao on 25 Oct 2021
I have to say that, for now, the RL toolbox in MATLAB is easy to use but it is hard to obtain satisfactory performance, still far from the Python platforms.
zimeng Wang
zimeng Wang on 1 Apr 2022
Hello, I can view the data in the buffer with this method, but how do I modify or delete the data in the buffer?
Arman Ali
Arman Ali on 1 Aug 2022
Have you found the answer? If yes, please guide me.


Priyanshu Mishra
Priyanshu Mishra on 26 Feb 2020

0 votes

Hi Daksh,
You may find the following link useful.

2 comments

Daksh Shukla
Daksh Shukla on 26 Feb 2020
Hello Priyanshu,
Thanks for your response.
However, the link does not exactly resolve the problem I am having. The link talks about running a lot of initial simulations and saving the agent with the experience buffer. But, what I would like to do is use data from "real experiments" and NOT simulations. I would like to add this data to the experience buffer or the replay memory to kick start the DDPG learning.
Based on all my reading and my attempts to access the experience buffer in MATLAB, it seems that the experience buffer object is a hidden property and I cannot upload data to it directly from an external source.
I would really appreciate it if you could let me know a direct way to upload data to the experience buffer, if there is one.
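For readers hitting the same wall on later releases: this was not supported in R2019b, but newer versions of the RL Toolbox (around R2022a) added `rlReplayMemory` with an `append` method, which allows externally collected transitions to be loaded before training. A hedged sketch; the spec dimensions and random data below are purely illustrative placeholders for real experimental measurements:

```matlab
% Hedged sketch for newer releases only (roughly R2022a+):
% build a replay memory and append a real-world transition to it.
obsInfo = rlNumericSpec([4 1]);   % illustrative observation spec
actInfo = rlNumericSpec([1 1]);   % illustrative action spec
buffer  = rlReplayMemory(obsInfo, actInfo, 10000);

% One experience tuple collected from a real experiment
exp.Observation     = {rand(4,1)};  % state (placeholder data)
exp.Action          = {rand(1,1)};  % action taken
exp.Reward          = 1;            % observed reward
exp.NextObservation = {rand(4,1)};  % resulting state
exp.IsDone          = 0;            % episode not terminated

append(buffer, exp);              % load the transition into the buffer
```

Consult the `rlReplayMemory` documentation for the exact experience struct field names and for how to attach such a buffer to an agent in your release.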
Francisco Serra
Francisco Serra on 20 Dec 2023
Any updates on this? @Daksh Shukla?

