MATLAB Answers

Error in custom reset function while trying to run Cart Pole example

A MATLAB newbie here trying out the Reinforcement Learning Toolbox and running into a very basic error.
I'm trying to understand how to use the custom reset and step functions. When I try to run the Cart Pole example given in "Create MATLAB Environment Using Custom Functions", I get an error.
Please let me know if you need more information. Any help is appreciated; I'm stuck :(
This is my code -
ObservationInfo = rlNumericSpec([4 1]);
ObservationInfo.Name = 'CartPole States';
ObservationInfo.Description = 'x, dx, theta, dtheta';
ActionInfo = rlFiniteSetSpec([-10 10]);
ActionInfo.Name = 'CartPole Action';
env = rlFunctionEnv(ObservationInfo,ActionInfo,'myStepFunction','myResetFunction');
function [InitialObservation, LoggedSignal] = myResetFunction()
% Reset function to place custom cart-pole environment into a random
% initial state.
% Theta (randomize)
T0 = 2 * 0.05 * rand() - 0.05;
% Thetadot
Td0 = 0;
% X
X0 = 0;
% Xdot
Xd0 = 0;
% Return initial environment state variables as logged signals.
LoggedSignal.State = [X0;Xd0;T0;Td0];
InitialObservation = LoggedSignal.State;
function [NextObs,Reward,IsDone,LoggedSignals] = myStepFunction(Action,LoggedSignals)
% Custom step function to construct cart-pole environment for the function
% name case.
% This function applies the given action to the environment and evaluates
% the system dynamics for one simulation step.
% Define the environment constants.
% Acceleration due to gravity in m/s^2
Gravity = 9.8;
% Mass of the cart
CartMass = 1.0;
% Mass of the pole
PoleMass = 0.1;
% Half the length of the pole
HalfPoleLength = 0.5;
% Max force the input can apply
MaxForce = 10;
% Sample time
Ts = 0.02;
% Pole angle at which to fail the episode
AngleThreshold = 12 * pi/180;
% Cart distance at which to fail the episode
DisplacementThreshold = 2.4;
% Reward each time step the cart-pole is balanced
RewardForNotFalling = 1;
% Penalty when the cart-pole fails to balance
PenaltyForFalling = -10;
% Check if the given action is valid.
if ~ismember(Action,[-MaxForce MaxForce])
    error('Action must be %g for going left and %g for going right.',...
        -MaxForce,MaxForce);
end
Force = Action;
% Unpack the state vector from the logged signals.
State = LoggedSignals.State;
XDot = State(2);
Theta = State(3);
ThetaDot = State(4);
% Cache to avoid recomputation.
CosTheta = cos(Theta);
SinTheta = sin(Theta);
SystemMass = CartMass + PoleMass;
temp = (Force + PoleMass*HalfPoleLength*ThetaDot*ThetaDot*SinTheta)/SystemMass;
% Apply motion equations.
ThetaDotDot = (Gravity*SinTheta - CosTheta*temp) / ...
(HalfPoleLength*(4.0/3.0 - PoleMass*CosTheta*CosTheta/SystemMass));
XDotDot = temp - PoleMass*HalfPoleLength*ThetaDotDot*CosTheta/SystemMass;
% Perform Euler integration.
LoggedSignals.State = State + Ts.*[XDot;XDotDot;ThetaDot;ThetaDotDot];
% Transform state to observation.
NextObs = LoggedSignals.State;
% Check terminal condition.
X = NextObs(1);
Theta = NextObs(3);
IsDone = abs(X) > DisplacementThreshold || abs(Theta) > AngleThreshold;
% Get reward.
if ~IsDone
    Reward = RewardForNotFalling;
else
    Reward = PenaltyForFalling;
end
and I get the following error -
Error using rl.env.rlFunctionEnv/set.ResetFcn (line 141)
There was an error setting the custom 'reset' function.
Error in rl.env.rlFunctionEnv (line 70)
this.ResetFcn = ResetFcn;
Error in rlFunctionEnv (line 45)
env = rl.env.rlFunctionEnv(varargin{:});
Error in tmp (line 8)
env = rlFunctionEnv(ObservationInfo,ActionInfo,'myStepFunction','myResetFunction');
Caused by:
Error using nargin
Not a valid MATLAB file.



Accepted Answer

Chidvi Modala
Chidvi Modala on 7 Aug 2020
From my understanding, you are trying to run the entire provided code in a single script. As mentioned in the link, the custom step and reset functions must be in your current working folder or on the MATLAB path. So, you need to separate the step and reset functions from the script and save each one in its own file, named after the function. You will end up with 3 files.
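To make the layout concrete, here is a sketch of how the three files could be organized. The script name runCartPole.m is just an illustrative choice; the two function file names, however, must match the function names exactly, because rlFunctionEnv is given the functions by their string names:

```matlab
% File 1: runCartPole.m (a script; name is arbitrary)
ObservationInfo = rlNumericSpec([4 1]);
ObservationInfo.Name = 'CartPole States';
ObservationInfo.Description = 'x, dx, theta, dtheta';
ActionInfo = rlFiniteSetSpec([-10 10]);
ActionInfo.Name = 'CartPole Action';
env = rlFunctionEnv(ObservationInfo,ActionInfo,'myStepFunction','myResetFunction');

% File 2: myResetFunction.m
% Contains only the myResetFunction definition from the post above.

% File 3: myStepFunction.m
% Contains only the myStepFunction definition from the post above.
```

With all three files in the same folder (or on the path), running the script should construct the environment without the "Not a valid MATLAB file" error.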

