8000 FlattenLayer fix(?) -- top should always Share with bottom by jeffdonahue · Pull Request #3025 · BVLC/caffe · GitHub



Open: wants to merge 1 commit into master
Conversation

jeffdonahue
Contributor

I pulled this commit from #2033, where it is no longer necessary (the current version doesn't use FlattenLayer), but in some previous version of my RNN implementation, this was causing incorrect behavior. The current implementation does bottom[0]->ShareDiff(*top[0]), which means that if bottom[0] has its own diff SyncedMemory, it gets deleted, which in #2033 must have interacted with my sharing of input diffs in weird ways.

This changes the behavior to work the same way as ReshapeLayer. It also seems somewhat more natural as the bottom blobs really "belong" to some other layer (whichever one outputs them as tops), whereas a layer's tops sort of belong to it, so it seems less dangerous for a layer to delete the underlying SyncedMemory of its own tops. However, I don't know of any actual cases where this causes problems in current Caffe, so I'll leave it to others to decide whether this should be merged.

@longjon
Contributor
longjon commented Sep 4, 2015

I haven't thought carefully (yet?) about whether the current behavior is okay, but I agree that this patch changes the code to what I would expect.

@ronghanghu
Member

Apart from the "ownership" issue, in my opinion a layer should never change its bottom during reshape and forward (and perhaps never change its top during backward). Ideally we should have const Blob<Dtype>* instead of Blob<Dtype>* for bottom blobs in Reshape and Forward.

I advocate putting the sharing in Reshape rather than in Forward and Backward.

@longjon
Contributor
longjon commented Sep 4, 2015

@ronghanghu yes; for details on why we don't (currently?) have const Blob* bottoms in forward, see #945.
