Logging needed information while training a reinforcement learning agent

Hello everyone
Currently I am using RL to develop an algorithm that optimizes the 3D movement of a UAV using certain models and calculations. Some of the data are used for the reward and some are not, but they are important for my research. I found that I could save them in LoggedSignals, but as the logged data grows it slows down training after a while, even though it is useful within a single episode or simulation. Is there a better way to log or save this unused yet important data?
Thank you

Accepted Answer

Ari Biswas on 1 Mar 2024
Unfortunately, there is no straightforward way to do this currently, but we may have a solution in an upcoming release (stay tuned).
For now, you can add the following code in your environment reset function. It saves the logged signals to disk and removes them from memory to improve performance.
currentDir = pwd;
env.ResetFcn = @(in) myResetFcn(in, currentDir);

function in = myResetFcn(in, currentDir)
    % your reset code...

    % Create a unique file name for each episode.
    % For parallel training, use a uuid instead to avoid incorrect episode indices:
    % s = matlab.lang.internal.uuid;
    s = string(datetime("now"), "yyyyMMddHHmmss");
    filename = fullfile(currentDir, "loggedData_" + s + ".mat");

    % Process the logged data after the episode simulation finishes.
    in = in.setPostSimFcn(@(x) myPostSim(x, filename));
end

function new_out = myPostSim(out, filename)
    % Save the logged data to disk. "logsout" is the name specified in
    % Model Settings > Data Import/Export > Signal logging.
    loggedData = out.logsout;
    save(filename, "loggedData");

    % Remove the logged data from the simulation output so it does not
    % accumulate in memory during training.
    new_out = out;
    new_out.logsout = [];
    new_out.tout = [];
end
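After training, the per-episode files can be read back for analysis. A minimal sketch along those lines, assuming the files were written by myPostSim above and that each saved loggedData is a Simulink.SimulationData.Dataset (the signal name in the plotting example is a placeholder):
% Collect all per-episode files written by myPostSim (run after training).
files = dir(fullfile(currentDir, "loggedData_*.mat"));
episodeLogs = cell(numel(files), 1);
for k = 1:numel(files)
    s = load(fullfile(files(k).folder, files(k).name), "loggedData");
    episodeLogs{k} = s.loggedData;   % one Dataset per episode
end
% Example: pull one signal out of the first episode and plot it.
% "mySignal" is a placeholder for a signal name in your model.
% sig = episodeLogs{1}.get("mySignal");
% plot(sig.Values.Time, sig.Values.Data)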

More Answers (0)
