MATLAB R2024b crashes when trying to continue training an RL agent
Hello everyone,
I have trained a TD3 agent in MATLAB with a Simulink environment. I saved the trained agent and its experience buffer in a MAT-file (around 9.8 GB). If I load the agent and run a simulation, I can see how the agent is acting, so I think everything is fine so far.
simOpts = rlSimulationOptions(MaxSteps=ceil(RL.Tf/RL.Ts), StopOnError="on", SimulationStorageType='memory'); %file
experiences = sim(env,agent,simOpts);
Now I would like to continue training the agent, but there I run into a problem.
% Option A: continue training with an evolution strategy
trainingStats = trainWithEvolutionStrategy(agent,env,evsTrainingOpts);
% Option B: standard gradient-based training
%trainingStats = train(agent,env,trainOpts);

% Keep the replay buffer when saving the agent
agent.AgentOptions.SaveExperienceBufferWithAgent = true;
folderName_for_save = folderName;
save(fullfile(folderName_for_save, 'trainedAgent.mat'), 'agent', '-v7.3');
save(fullfile(folderName_for_save, 'trainingStats.mat'), 'trainingStats', '-v7.3');
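One thing worth trying before resuming: with parallel training, the 9.8 GB saved buffer can be multiplied across workers and exhaust 16 GB of RAM. Below is a minimal sketch of two ways to reduce the memory footprint; the property names (`ExperienceBufferLength`, `SaveExperienceBufferWithAgent`) are standard `rlTD3AgentOptions` fields, but whether the buffer length can be shrunk in place on an already-filled buffer depends on the release, so please verify against your version's documentation.

```matlab
% Sketch: reduce memory pressure before resuming parallel training.
% Assumes 'agent' is the loaded rlTD3Agent and 'folderName' is as above.

% Option 1: cap the replay buffer to something smaller than the saved
% 9.8 GB buffer (value here is illustrative only).
agent.AgentOptions.ExperienceBufferLength = 5e5;

% Option 2: save the agent WITHOUT the buffer, at the cost of discarding
% the stored experiences when training resumes.
agent.AgentOptions.SaveExperienceBufferWithAgent = false;
save(fullfile(folderName, 'trainedAgent_noBuffer.mat'), 'agent', '-v7.3');
```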
I run the training in parallel with 8 workers, and now it will not start again. I get this error:
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1813903/image.png)
Is it because I only have an AMD Ryzen 5000 CPU with 16 GB of DDR4 RAM, and I would need 32 GB? Or is it something else?
Alex
1 Comment
Sumukh
on 26 Nov 2024
The exit status does seem to indicate that this is an out-of-memory issue.
Could you try increasing the Java heap memory size at Preferences -> MATLAB -> General -> Java Heap Memory and see if the issue persists?
If it still crashes, please contact MathWorks Support for further assistance in resolving the crash.