Reinforcement Learning with Parallel Computing Query

Hi All,
I am attempting to get parallel computing enabled when I train my RL agent in R2022a. Forgive the basic question regarding parallel computing, as this is my first attempt. My laptop has an NVIDIA GeForce RTX 3060. I keep running into an issue where the pool goes idle ("IdleTimeout"), and I have had to restart the pool on several occasions. I left it to run overnight and again it stalled and stopped training. I am not sure what is happening; any help would be great. I have included some screen grabs below, along with the code from my RL script, the GPU device properties, and the error showing the pool stopped. I did have the Episode Manager open during the simulation, but the learning did not seem the same as when I originally ran the training on the CPU. Also, in R2022a I am unable to stop training via the Episode Manager.
Thanks in advance,
Patrick
trainingOpts.UseParallel = true;
trainingOpts.ParallelizationOptions.Mode = 'async';
trainingOpts.ParallelizationOptions.StepsUntilDataIsSent = 32;
trainingOpts.ParallelizationOptions.DataToSendFromWorkers = 'Experiences';
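For reference, a rough sketch of how these options sit in the rest of the setup (the rlTrainingOptions values and the agent/env variable names below are placeholders, not the actual ones from my script):

% Training options (placeholder episode counts and stopping criteria)
trainingOpts = rlTrainingOptions( ...
    'MaxEpisodes', 1000, ...
    'MaxStepsPerEpisode', 500, ...
    'StopTrainingCriteria', 'AverageReward', ...
    'StopTrainingValue', 480);

% Parallel settings as above
trainingOpts.UseParallel = true;
trainingOpts.ParallelizationOptions.Mode = 'async';
trainingOpts.ParallelizationOptions.StepsUntilDataIsSent = 32;
trainingOpts.ParallelizationOptions.DataToSendFromWorkers = 'Experiences';

% Start the pool explicitly rather than letting train create it implicitly
if isempty(gcp('nocreate'))
    parpool('local');
end

trainingStats = train(agent, env, trainingOpts);   % agent and env defined earlier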
The "IdleTimeout" error is shown below:
Episode Manager:
  1 Comment
Joss Knight on 27 Jul 2022
Can you try increasing your pool IdleTimeout? Maybe this tool is spending a long time computing on the client and meanwhile your pool is idle.
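Something along these lines should work (a sketch only, assuming the default local profile); IdleTimeout is a settable property of the pool object, in minutes:

p = gcp('nocreate');                              % current pool, if any
if isempty(p)
    p = parpool('local', 'IdleTimeout', 120);     % start a pool with a longer timeout
else
    p.IdleTimeout = 120;                          % raise the timeout on the existing pool
end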


Answers (2)

PB75 on 27 Jul 2022
Hi Joss,
Thanks for taking the time to answer my question. I have unchecked the idle timeout in preferences; however, I still encounter this error when I run the script.
  2 Comments
Joss Knight on 27 Jul 2022
Please try restarting MATLAB, and also try converting your script to a .m file (using Save As) in case the Live Script is causing the issue.
In the meantime, I am asking others for help.
Joss Knight on 27 Jul 2022
By the way, it seems the Idle Timeout is occurring simply because your code errored during execution and stopped running. The error is displayed in the Live Script. Then the pool times out and the timeout message is displayed in the command window.



PB75 on 27 Jul 2022
Hi Joss,
I have done as you recommended, but the issue seems to still be there when running the .m script as well: alongside the error, the episode reward does not change during training. See the screen grab below from the .m script run.
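As a sanity check before training, something like the following can confirm that each worker in the pool can actually see the GPU (sketch only; it will error on any worker without GPU access):

spmd
    gd = gpuDevice;                                   % errors if the worker cannot access a GPU
    fprintf('Worker %d sees GPU: %s\n', labindex, gd.Name);
end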
  3 Comments
PB75 on 6 Sep 2022
Hi All,
Any joy on the above error when running parallel computing in RL training?
Cheers
Joss Knight on 10 Sep 2022
I think this is too hard to debug with this little information. There is some sort of error happening during training, and we would probably need to inspect the logs to determine what happened. Please open a support case.
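In the meantime, wrapping the call to train in a try/catch (a sketch, assuming the same agent, env, and trainingOpts as above) captures the full error report, which would be useful to attach to the support case:

trainingOpts.Verbose = true;                      % print episode information to the Command Window
try
    trainingStats = train(agent, env, trainingOpts);
catch trainErr
    disp(getReport(trainErr, 'extended'));        % full error report, including worker-side causes
end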

