Can a learning rate drop factor be used with the Adam optimizer in DDPG?

Can a learning rate drop factor be used with the Adam optimizer in DDPG, so that the learning rate decays over training steps, if that is possible? The agent options only provide OptimizerParameters, which does not contain a learn rate drop factor.
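For reference, a minimal sketch of the options the question refers to (assuming R2022a or later, where actor/critic optimizers are configured through rlOptimizerOptions; the values below are placeholders, not recommendations):

% Configure Adam optimizers for a DDPG agent (assumes R2022a+ API).
% rlOptimizerOptions exposes a fixed LearnRate plus OptimizerParameters
% (Epsilon, GradientDecayFactor, SquaredGradientDecayFactor); unlike
% trainingOptions for deep learning, it has no LearnRateDropFactor or
% LearnRateDropPeriod property.
criticOpts = rlOptimizerOptions( ...
    Algorithm="adam", ...
    LearnRate=1e-3, ...
    GradientThreshold=1);

actorOpts = rlOptimizerOptions( ...
    Algorithm="adam", ...
    LearnRate=1e-4, ...
    GradientThreshold=1);

agentOpts = rlDDPGAgentOptions( ...
    ActorOptimizerOptions=actorOpts, ...
    CriticOptimizerOptions=criticOpts);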

Answers (0)
