Load data into experience buffer: DDPG agent
I am using RL Toolbox version 1.1 with MATLAB R2019b, and I am using the DDPG agent to design a controller. Is there a way to load data (state, action, reward, next state) collected from real experiments into the experience buffer before starting training?
Answers (2)
JiaZheng Yan
31 Mar 2020
1 vote
I found a way to expose the Memory of the experience buffer.
Open the file "ExperienceBuffer.m", which is in "...\Matlab\toolbox\rl\rl\+rl\+util".
In this file you can change the property attributes of the variable Memory so that it becomes accessible. For example:

Then you set:
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training, you can access the data in agent.ExperienceBuffer.Memory.
This also means that you can modify and use the training data.
I hope this method works for you : )
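Putting the steps above together, here is a minimal sketch of the full workflow, assuming an existing environment env plus an actor and critic (their construction is omitted for brevity); verify the option names against your toolbox release:

```matlab
% Create agent options that keep the experience buffer across save/load,
% as described in the answer above.
agentOpts = rlDDPGAgentOptions;
agentOpts.SaveExperienceBufferWithAgent = true;        % save buffer with agent
agentOpts.ResetExperienceBufferBeforeTraining = false; % reuse it on next run

agent = rlDDPGAgent(actor, critic, agentOpts);

trainOpts = rlTrainingOptions('MaxEpisodes', 100);
train(agent, env, trainOpts);

% After training, inspect one stored experience tuple
% (state, action, reward, next state, is_done):
exp = agent.ExperienceBuffer.Memory{1};
```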
8 comments
Ao Liu
3 Jul 2020
Hello, I am using the 2020a toolbox and ran into this problem as well. I made the change the way you described, but it does not seem to work. Is there a solution? Thanks!

JiaZheng Yan
4 Jul 2020

I tried it: setting the Memory variable's property attributes in either of the two locations shown makes it visible and accessible.
Remember to add the following lines to the code where you set up the agent's training options:
agentOpts.SaveExperienceBufferWithAgent = true; % (this line is the key)
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training completes, you should be able to see the Memory element.


The length of Memory may affect how the data is displayed; assigning a sampled element to a variable shows the data in more detail:
a = agent.ExperienceBuffer.Memory{1} % take one element to inspect

The entries represent (state, action, reward, next state, is_done).
Save this agent and load it at the start of the next run; this lets you train the agent repeatedly.
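The save/reload cycle described above can be sketched as follows (file and variable names are illustrative; env and trainOpts are assumed to exist from the earlier setup):

```matlab
% Save the trained agent; with SaveExperienceBufferWithAgent = true the
% experience buffer is stored alongside it.
save('trainedAgent.mat', 'agent');

% ... later, in a new session:
loaded = load('trainedAgent.mat');
agent = loaded.agent;

% With ResetExperienceBufferBeforeTraining = false, the previously
% collected experiences remain in the buffer and training resumes from them.
train(agent, env, trainOpts);
```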
Fabian Hart
21 Jul 2020
Thanks for your answer!
Unfortunately I have trouble writing to MATLAB system files. When I try to do that, the following message appears:
"Error writing ExperienceBuffer.m .... Access denied"
Could you please tell me how you managed that? (Windows 10)
JiaZheng Yan
23 Jul 2020
Sorry, could you provide a more detailed description or a screenshot of the error report? I have never seen that error myself.
(I guess it is a file-path problem.)
zhou jianhao
25 Oct 2021
Hi, Jiazheng.
Your method works, thanks a lot!
One thing still bothers me: is there any way to access the memory buffer during training, for example if I want to use prioritized experience replay?
Hope to hear from you soon.
Thanks!
Regards!
zhou jianhao
25 Oct 2021
I have to say that, for now, the RL toolbox in MATLAB is easy to use but it is hard to obtain satisfactory performance with it, still far behind the Python platforms.
zimeng Wang
1 Apr 2022
Hello, I can view the data in the buffer with this method, but how do I modify or delete the data in the buffer?
Arman Ali
1 Aug 2022
Have you found an answer? If so, please share.
Priyanshu Mishra
26 Feb 2020
0 votes
Hi Daksh,