[Feature Request] key-level granularity in `skip_existing` · Issue #352 · pytorch/tensordict
[Feature Request] key-level granularity in skip_existing #352
Open
@matteobettini

Description

Currently skip_existing operates on all keys without any granularity.

This is a problem in RL: in a loss module, for example, you may want to skip existing "values", but you never want to skip existing "memory" in a memory-based model (an RNN). In other words, if skip_existing applies to memory keys, your memory will never be updated; see the sketch below.
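
To make the failure mode concrete, here is a minimal sketch of today's all-or-nothing behavior, using tensordict's set_skip_existing / skip_existing and illustrative key names ("hidden", "value"):

    import torch
    from tensordict import TensorDict
    from tensordict.nn import set_skip_existing, skip_existing

    def rnn_step(td):
        # Recurrent update: this must run on every call,
        # otherwise the memory goes stale.
        if skip_existing() and "hidden" in td.keys():
            return td  # skipped: "hidden" is never refreshed
        td["hidden"] = torch.tanh(td["obs"] + td["hidden"])
        return td

    def value_head(td):
        # Value estimate: reusing a precomputed "value" is often desirable.
        if skip_existing() and "value" in td.keys():
            return td
        td["value"] = td["hidden"].sum(-1, keepdim=True)
        return td

    td = TensorDict(
        {"obs": torch.randn(2, 4),
         "hidden": torch.zeros(2, 4),
         "value": torch.zeros(2, 1)},
        batch_size=[2],
    )
    with set_skip_existing(True):
        value_head(rnn_step(td))  # both skip: the hidden state is frozen too

Because the flag is global, the recurrent update is skipped along with the value computation, which is exactly the behavior we cannot afford.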

This is needed to support RNNs in TorchRL (see pytorch/rl#1060).

We need a solution to make skip_existing more granular.

The fix is really simple: feed the set_skip_existing function the keys we actually want to skip.

    with set_skip_existing(["value", "value_target"]):
        loss(td)  # reuses existing values, but recomputes the hidden memory

By default, if no keys are passed, the behavior remains the same as the current set_skip_existing(True).
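
For illustration only, a module-side check under this proposal could look like the following, assuming skip_existing() would then return either a bool or the collection of skippable keys (none of this is an existing API, and the helper name is hypothetical):

    def _should_skip(td, out_keys):
        # Hypothetical helper: skip a module only if ALL of its out_keys
        # are both marked as skippable and already present in the tensordict.
        mode = skip_existing()  # proposed: True/False, or an iterable of keys
        if mode is True:
            skippable = set(out_keys)
        elif mode:
            skippable = set(mode)
        else:
            return False
        return all(key in skippable and key in td.keys() for key in out_keys)

With such a check, "hidden" would never appear in the skippable set, so the recurrent update always runs while "value" can still be reused.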

Labels: enhancement (New feature or request)