[Feature Request] key-level granularity in skip_existing
#352
Labels
enhancement
Currently, `skip_existing` operates on all keys without any granularity. This is a problem in RL: in a loss module, for example, you may want to skip existing "values", but you never want to skip existing "memory" in a memory-based model (RNN). In other words, if you use `skip_existing` on memory keys, your memory will never be updated.
This is needed to support RNNs in TorchRL (issue pytorch/rl#1060).
We need a solution to make `skip_existing` more granular. This is really simple and consists in feeding the `set_skip_existing` function the keys we actually want to skip. By default, if no keys are passed, the behavior remains the same as the current `set_skip_existing=True`.
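To illustrate the proposal, here is a minimal, self-contained sketch of what key-level granularity could look like. This is not the actual tensordict API: `set_skip_existing`, `should_skip`, and the global flag are hypothetical stand-ins used only to show the intended semantics (`True` = skip all existing keys, a collection = skip only those keys).

```python
from contextlib import contextmanager

# Hypothetical global holding the current skip setting:
# True -> skip every existing key (current behavior),
# False -> skip nothing, a set of keys -> skip only those.
_SKIP_EXISTING = False

@contextmanager
def set_skip_existing(keys=True):
    """Hypothetical context manager: set which existing keys to skip."""
    global _SKIP_EXISTING
    prev, _SKIP_EXISTING = _SKIP_EXISTING, keys
    try:
        yield
    finally:
        _SKIP_EXISTING = prev

def should_skip(key, data):
    """Return True if writing `key` should be skipped: the key already
    exists in `data` and is covered by the current skip setting."""
    if key not in data:
        return False
    if _SKIP_EXISTING is True:
        return True
    if _SKIP_EXISTING is False:
        return False
    return key in _SKIP_EXISTING

# Usage: skip the existing "values" entry, but always refresh "memory"
# (the RNN hidden state from the example above).
data = {"values": 1, "memory": 2}
with set_skip_existing(keys={"values"}):
    skip_values = should_skip("values", data)  # True: covered by the key set
    skip_memory = should_skip("memory", data)  # False: memory is always rewritten
```

With this shape, `set_skip_existing(keys=True)` keeps today's all-or-nothing behavior, so existing call sites would not break.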