How to add a custom environment for Reinforcement Learning Toolbox?
I want to make a 3D environment representing a neighborhood, with some blocks as the buildings. I want to use this model as the environment in the toolbox so that the agent can interact with it and find the shortest path. How should I do that? I'm having difficulty defining this using classes.
0 comments
Answers (1)
Emmanouil Tzorakoleftherakis
27 Oct 2023
Why do you need a 3D world for this problem? Unless you actually use the z dimension (e.g., if you are planning for UAVs), a 2D environment is enough. Even if you built the 3D world for visualization, I wouldn't recommend training in it, since that would only make training slower. I would start with a grid world or an occupancy grid that you can tailor to match the 3D world you have in mind:
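For illustration only, here is a minimal grid-world sketch along those lines. The grid size, obstacle cells, and reward values are placeholders rather than your actual neighborhood, and you should verify the exact grid-world function names (in particular the obstacle-transition helper) against your Reinforcement Learning Toolbox version.

% Minimal 2-D grid world standing in for the neighborhood
% (sizes, obstacle cells, and rewards below are placeholders).
GW = createGridWorld(10,10);                  % 10-by-10 grid of cells
GW.CurrentState   = "[1,1]";                  % start cell
GW.TerminalStates = "[10,10]";                % goal cell
GW.ObstacleStates = ["[3,3]";"[3,4]";"[4,3]";"[4,4]";"[7,8]"];  % "buildings"
updateStateTranstionForObstacles(GW);         % block transitions into obstacle cells

% Small step penalty encourages the shortest path; large reward at the goal.
nS = numel(GW.States);
nA = numel(GW.Actions);
GW.R = -1*ones(nS,nS,nA);
GW.R(:,state2idx(GW,GW.TerminalStates),:) = 10;

% Wrap the grid world as an RL environment and inspect its specs.
env = rlMDPEnv(GW);
obsInfo = getObservationInfo(env);
actInfo = getActionInfo(env);

From there, env can be passed to a tabular agent (for example a Q-learning agent) and trained to reach the goal cell along the shortest path.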
2 comments
Emmanouil Tzorakoleftherakis
28 Oct 2023
You can also create 3-D occupancy maps like this:
https://www.mathworks.com/help/uav/ug/generate-random-3-d-occupancy-map-for-uav-motion-planning.html
and turn them into RL training environments by following the example I shared earlier:
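As a rough sketch of that idea, you can also build such a map by hand with occupancyMap3D from Navigation Toolbox; the block coordinates below are placeholders standing in for the buildings.

% Sketch of a 3-D occupancy map with two box-shaped "buildings"
% (coordinates and sizes are placeholders).
map3D = occupancyMap3D(1);                    % resolution: 1 cell per meter

% Mark the voxels of two rectangular blocks as occupied.
[x,y,z] = meshgrid(10:20, 10:20, 0:15);
setOccupancy(map3D, [x(:) y(:) z(:)], 1);
[x,y,z] = meshgrid(30:40, 25:35, 0:25);
setOccupancy(map3D, [x(:) y(:) z(:)], 1);

show(map3D)                                   % visualize the neighborhood

For training, you would then connect this map to an RL environment in the same way as the grid-world / occupancy-grid workflow described above.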