Description
Currently, `skip_existing` operates on all keys, without any granularity.
This is a problem in RL: in a loss module, for example, you may want to skip existing "value" entries, but you never want to skip existing "memory" entries in a memory-based model (an RNN). In other words, if `skip_existing` applies to memory keys, the memory will never be updated.
This is needed to support RNNs in TorchRL (issue pytorch/rl#1060).
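
To make the failure mode concrete, here is a toy sketch of the problem; the `rnn_step` helper and the `"value"`/`"hidden"` key names are hypothetical illustrations, not TorchRL code:

```python
import torch

def rnn_step(td, skip_existing_keys):
    # Write "value" unless it is marked skippable and already present
    if "value" not in skip_existing_keys or "value" not in td:
        td["value"] = td["obs"].sum(-1, keepdim=True)
    # Write "hidden" (the memory update) under the same rule
    if "hidden" not in skip_existing_keys or "hidden" not in td:
        td["hidden"] = td["hidden"] + 1

td = {"obs": torch.ones(3), "hidden": torch.zeros(1), "value": torch.zeros(1)}

# Blanket skip: "hidden" already exists, so the memory is never updated
rnn_step(td, skip_existing_keys={"value", "hidden"})
print(td["hidden"])  # tensor([0.])

# Granular skip: only "value" is skipped, memory updates as expected
rnn_step(td, skip_existing_keys={"value"})
print(td["hidden"])  # tensor([1.])
```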
We need a solution to make `skip_existing` more granular.
The fix is really simple and consists in feeding the `set_skip_existing` function the keys we actually want to skip:
```python
with set_skip_existing(["value", "value_target"]):
    loss(td)  # will use existing values but not existing hidden memory
```
By default, if no keys are passed, the behavior remains the same as the current `set_skip_existing(True)`.
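
For reference, a minimal sketch of what a key-aware context manager could look like; the names (`_SKIP_EXISTING`, `_should_skip`) and the fallback semantics are assumptions for illustration, not the actual tensordict implementation:

```python
from contextlib import contextmanager

_SKIP_EXISTING = False  # True, False, or an explicit collection of keys

@contextmanager
def set_skip_existing(keys=True):
    """Skip recomputation of existing entries.

    keys=True  -> skip every existing key (current behavior)
    keys=[...] -> skip only the listed keys (proposed granular behavior)
    keys=False -> never skip
    """
    global _SKIP_EXISTING
    prev, _SKIP_EXISTING = _SKIP_EXISTING, keys
    try:
        yield
    finally:
        _SKIP_EXISTING = prev

def _should_skip(key, td):
    """Checked by a module before writing `key` into the tensordict."""
    if _SKIP_EXISTING is True:
        return key in td
    if not _SKIP_EXISTING:
        return False
    return key in _SKIP_EXISTING and key in td
```

With this shape, `set_skip_existing(True)` preserves today's semantics, while passing a list restricts skipping to the given keys, so memory keys are always rewritten.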