Multi-agent reinforcement learning for gain tuning in power electronics
Answers (1)
Hi @Muhammad,
Upon reviewing the code, the observation and action specifications are created with rlNumericSpec, which is a valid class for defining observation and action spaces. However, the error message indicates a mismatch in the expected input types. The issue may arise from the way observationInfo and actionInfo are passed to the rlSimulinkEnv function. To resolve it, ensure that observationInfo and actionInfo are explicitly defined as cell arrays of rl.util.RLDataSpec objects. The current code already does this, but it is worth verifying that the rlNumericSpec objects are indeed recognized as rl.util.RLDataSpec objects. Here is a revised version of the code with additional checks:
% Load the Simulink model
mdl = 'RL_two_agents'; % model name without the .slx extension, so block paths resolve correctly
open_system(mdl);

% I/O specifications for agent A
obsInfo1 = rlNumericSpec([3, 1]);
obsInfo1.Name = 'observations1';
obsInfo1.Description = 'a, b, c';
actInfo1 = rlNumericSpec([1, 1], 'LowerLimit', 1, 'UpperLimit', 20);
actInfo1.Name = 'gain_1';

% I/O specifications for agent B
obsInfo2 = rlNumericSpec([3, 1]);
obsInfo2.Name = 'observations2';
obsInfo2.Description = 'a2, b2, c2';
actInfo2 = rlNumericSpec([1, 1], 'LowerLimit', 0.01, 'UpperLimit', 1);
actInfo2.Name = 'gain_2';
% Combine observation and action specs into cell arrays
observationInfo = {obsInfo1, obsInfo2}; % Cell array of observation info
actionInfo = {actInfo1, actInfo2}; % Cell array of action info
% Create the reinforcement learning environment
env = rlSimulinkEnv(mdl, ...
{'RL_two_agents/RL Agent1', 'RL_two_agents/RL Agent2'}, ...
observationInfo, actionInfo); % Explicitly pass as cell arrays
% Set the reset function
env.ResetFcn = @(in)localResetFcn(in);
Make sure that the agent block paths are passed as a cell array in the rlSimulinkEnv function and verify that the observationInfo and actionInfo are indeed recognized as cell arrays of rl.util.RLDataSpec objects.
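As a quick sanity check before creating the environment, you can confirm that each agent block path actually exists in the loaded model. This is only a sketch: getSimulinkBlockHandle is a standard Simulink function that returns -1 when a block is not found, and the block paths below are the ones assumed in this thread.

% Sketch: verify each agent block path before calling rlSimulinkEnv
blockPaths = {'RL_two_agents/RL Agent1', 'RL_two_agents/RL Agent2'};
for k = 1:numel(blockPaths)
    h = getSimulinkBlockHandle(blockPaths{k}); % returns -1 if the block does not exist
    if h == -1
        error('Agent block not found in model: %s', blockPaths{k});
    end
end

Running this loop before rlSimulinkEnv turns a confusing environment-creation error into an explicit "block not found" message.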
Hope this helps.
Hi @Muhammad,
Since you have not shared your code, I will address the error you kept experiencing with the rlSimulinkEnv function and break down the key areas of concern. The error message you received, "The value of agentBlock is invalid. Expected input to be one of these types: char, string", suggests that the paths provided for your agent blocks are not formatted correctly. This typically happens when they are passed as a cell array or another data structure that is not recognized as a character array or string. So, go back to your code and make sure that the agentBlocks argument in rlSimulinkEnv is passed correctly. If you have multiple agents, pass them as a string array or as character vectors, not as a cell array. Here is how you can adjust your code:
agentBlocks = ["RL_two_agents/RL Agent1", "RL_two_agents/RL Agent2"]; % String array
env = rlSimulinkEnv(mdl, agentBlocks, observationInfo, actionInfo);
While your observation and action specifications appear correct, it is prudent to double-check that they are indeed rl.util.RLDataSpec objects. You can verify this with isa:
isa(observationInfo{1}, 'rl.util.RLDataSpec') % Should return logical 1 (true)
isa(actionInfo{1}, 'rl.util.RLDataSpec')      % Should return logical 1 (true)
(Note that class() returns the concrete subclass name rather than the base class, so isa is the more reliable check here.) Make sure that the Simulink model (`mdl`) is loaded correctly and contains the specified agent blocks. Check for typos in the block names or paths, and use MATLAB's built-in functions such as get_param() to confirm that your agent blocks exist in the model and are accessible. Here is again how your adjusted code snippet might look:
mdl = 'RL_two_agents'; % model name without the .slx extension
open_system(mdl);

% I/O specifications for agent A
obsInfo1 = rlNumericSpec([3, 1]);
obsInfo1.Name = 'observations1';
obsInfo1.Description = 'a, b, c';
actInfo1 = rlNumericSpec([1, 1], 'LowerLimit', 1, 'UpperLimit', 20);
actInfo1.Name = 'gain_1';

% I/O specifications for agent B
obsInfo2 = rlNumericSpec([3, 1]);
obsInfo2.Name = 'observations2';
obsInfo2.Description = 'a2, b2, c2';
actInfo2 = rlNumericSpec([1, 1], 'LowerLimit', 0.01, 'UpperLimit', 1);
actInfo2.Name = 'gain_2';

% Combine into cell arrays
observationInfo = {obsInfo1, obsInfo2};
actionInfo = {actInfo1, actInfo2};

% Create environment
agentBlocks = ["RL_two_agents/RL Agent1", "RL_two_agents/RL Agent2"];
env = rlSimulinkEnv(mdl, agentBlocks, observationInfo, actionInfo);

% Set reset function
env.ResetFcn = @(in)localResetFcn(in);
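As an additional check, the Reinforcement Learning Toolbox provides validateEnvironment, which resets the environment and runs a brief simulation to confirm that the observation and action specifications actually match the model. This assumes the env variable created above:

% Optional sanity check on the freshly created environment
validateEnvironment(env); % errors if the specs do not match the Simulink model

Calling this once after environment creation surfaces spec/model mismatches before you invest time in training.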
As a piece of advice, consider adding try-catch blocks around your environment creation code to capture and handle errors more gracefully.
By ensuring that all parameters are correctly formatted and checking for typographical errors or misconfigurations within your Simulink model setup, you should be able to resolve this issue effectively.
If further errors arise after making these adjustments, feel free to share those messages for additional troubleshooting assistance.
Hi @Muhammad,
Glad to know that you are making progress. You mentioned that you resolved the initial error by using:
blks = mdl + ["/RL Agent1", "/RL Agent2"];
env = rlSimulinkEnv(mdl, blks, observationInfo, actionInfo);
This is a valid approach, but ensure that blks is indeed a string array and that mdl holds the model name without the .slx extension; otherwise the concatenated block paths will not resolve. You can make the conversion explicit as follows:
blks = string(mdl + ["/RL Agent1", "/RL Agent2"]); % Ensure blks is a string array
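For reference, this concatenation works because MATLAB's + operator concatenates strings, and a character vector combined with a string array is promoted to a string array. A minimal sketch, assuming the model name is 'RL_two_agents':

mdl  = 'RL_two_agents';                      % character vector
blks = mdl + ["/RL Agent1", "/RL Agent2"];   % char + string array -> 1x2 string array
% blks(1) is "RL_two_agents/RL Agent1"
% blks(2) is "RL_two_agents/RL Agent2"

Because the result is already a string array, the extra string() wrapper is harmless but not strictly required.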
Please still make sure that your observation and action specifications are correctly defined as rl.util.RLDataSpec objects. Here’s how you can define them:
% I/O specifications for agent A
obsInfo1 = rlNumericSpec([3, 1]);
obsInfo1.Name = 'observations1';
obsInfo1.Description = 'a, b, c';
actInfo1 = rlNumericSpec([1, 1], 'LowerLimit', 1, 'UpperLimit', 20);
actInfo1.Name = 'gain_1';

% I/O specifications for agent B
obsInfo2 = rlNumericSpec([3, 1]);
obsInfo2.Name = 'observations2';
obsInfo2.Description = 'a2, b2, c2';
actInfo2 = rlNumericSpec([1, 1], 'LowerLimit', 0.01, 'UpperLimit', 1);
actInfo2.Name = 'gain_2';
% Combine into cell arrays
observationInfo = {obsInfo1, obsInfo2};
actionInfo = {actInfo1, actInfo2};
With the corrected agent blocks and specifications, you can create the environment as follows:
mdl = 'RL_two_agents'; % model name without the .slx extension
open_system(mdl); % Ensure the model is open

% Create environment
agentBlocks = string(mdl + ["/RL Agent1", "/RL Agent2"]); % Ensure this is a string array
env = rlSimulinkEnv(mdl, agentBlocks, observationInfo, actionInfo);

% Set reset function
env.ResetFcn = @(in)localResetFcn(in);
The error you encountered in the Reinforcement Learning Designer app—Items must be a 1-by-N cell array of character vectors or a string array—suggests that the input provided to the UI component is not in the expected format. To resolve this, ensure that when you add new agents, you are passing a string array or a cell array of character vectors. For example:
% Example of adding agents in the Reinforcement Learning Designer
agentNames = ["RL Agent1", "RL Agent2"]; % String array
% If using a cell array, it should be:
% agentNames = {'RL Agent1', 'RL Agent2'};
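If a list is already in one of these two formats and the other is needed, the built-in string() and cellstr() functions convert between them; a minimal sketch:

agentCell = {'RL Agent1', 'RL Agent2'};  % 1-by-2 cell array of character vectors
agentStr  = string(agentCell);           % -> 1-by-2 string array ["RL Agent1" "RL Agent2"]
backAgain = cellstr(agentStr);           % -> back to a cell array of character vectors

Either form is accepted wherever a "1-by-N cell array of character vectors or a string array" is expected.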
To enhance the robustness of your code, consider wrapping your environment creation in a try-catch block:
try
    env = rlSimulinkEnv(mdl, agentBlocks, observationInfo, ...
        actionInfo);
catch ME
    disp('An error occurred while creating the environment:');
    disp(ME.message);
end
If further errors arise, please provide the specific error messages for additional troubleshooting assistance.
Feel free to reach out if you have any more questions or need further clarification on any of the steps outlined above.
Hi @Muhammad,
This error typically arises when the Items property of a UI control (like a dropdown or list box) is not being set correctly. In MATLAB, the Items property must be either a 1-by-N cell array of character vectors or a string array. Here's how you can address this issue. First, ensure that the data you're assigning to the Items property is formatted correctly. For example:
Cell array:
app.DropDown.Items = {'Item1', 'Item2', 'Item3'};
String array:
app.DropDown.Items = ["Item1", "Item2", "Item3"];
Verify the source of the data being assigned to Items. If it is being generated dynamically (e.g., from a simulation), print it out before assignment to confirm its structure. If you are using a function to generate this list, ensure that it returns the correct format. Here is an example of how to correctly set up a dropdown in your App:
function startupFcn(app)
    % Example items for the dropdown
    items = {'Agent1', 'Agent2', 'Agent3'};
    app.DropDown.Items = items; % Assign items in a valid format
end
If you are setting Items within a callback function, make sure that it is being executed after all necessary variables are initialized and available.
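When the list is built dynamically, it helps to normalize it before the assignment. The sketch below is illustrative: buildAgentList is a hypothetical helper standing in for whatever produces your list, and the conversion branch assumes string() can handle your source type.

% Sketch: normalize a dynamically built list before assigning to Items
rawItems = buildAgentList(); % hypothetical source of the list
if isstring(rawItems)
    app.DropDown.Items = rawItems;         % string array is accepted as-is
elseif iscellstr(rawItems)
    app.DropDown.Items = rawItems;         % cell array of char vectors is also valid
else
    app.DropDown.Items = string(rawItems); % attempt conversion, e.g. from numeric or categorical
end

Guarding the assignment this way makes the "Items must be a 1-by-N cell array of character vectors or a string array" error much easier to track down.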
Hope this helps.