How to save a trained Q-network from a DQN agent?

一馬 平田 on 31 Oct 2021
Answered: Abhiram on 12 Jun 2025
I would like to load a trained Q-network into rlQValueRepresentation.
How can I save the pre-trained Q-network?
I know that a DQN agent can be saved with rlTrainingOptions, but I could not confirm whether this also saves the pre-trained Q-network.
If it is possible to save the pre-trained Q-network with rlTrainingOptions, could you please tell me how to load the Q-network?

Answers (1)

Abhiram on 12 Jun 2025
To save and load a trained Q-network used with rlQValueRepresentation, extract the Q-network (the critic) from the trained agent and save it as a MAT file. Code snippets for saving and loading the Q-network:
% Extract Q-network from trained agent
qRep = getCritic(agent);
% Save the Q-network to a file
save('savedQNetwork.mat','qRep');
% Load the Q-network from file
load('savedQNetwork.mat','qRep');
% Rebuild agent from loaded Q-network (assuming agent options are available)
agentFromLoadedQ = rlDQNAgent(qRep, agentOpts);
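If only the underlying deep network is needed (for example, for inspection or inference outside the agent), it can be pulled out of the critic representation with "getModel". A minimal sketch, assuming "agent" is the trained DQN agent from above and the file name is a placeholder:
% Extract the critic representation from the trained agent
qRep = getCritic(agent);
% Get the underlying network model stored in the representation
net = getModel(qRep);
% Save just the network to a MAT file
save('savedQNetModel.mat','net');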
For more information on the "save", "load", "rlDQNAgent", and "getCritic" functions, refer to the MATLAB documentation.
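Regarding the rlTrainingOptions part of the question: the agent can also be saved automatically during training with the "SaveAgentCriteria", "SaveAgentValue", and "SaveAgentDirectory" options. A minimal sketch (the reward threshold, folder name, and file name here are placeholders; by default the saved MAT files contain a variable named "saved_agent"):
% Save the agent whenever the episode reward exceeds a threshold
trainOpts = rlTrainingOptions( ...
    'SaveAgentCriteria','EpisodeReward', ...
    'SaveAgentValue',500, ...            % placeholder threshold
    'SaveAgentDirectory','savedAgents');
trainStats = train(agent,env,trainOpts);
% Later: load one of the saved agents and extract its Q-network
data = load(fullfile('savedAgents','Agent1.mat'));  % file name depends on the episode
qRep = getCritic(data.saved_agent);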
Hope this helps!

Release

R2021b
