FlattenLayer fix(?) -- top should always Share with bottom #3025
I pulled this commit from #2033, where it is no longer necessary (the current version doesn't use `FlattenLayer`), but in some previous version of my RNN implementation it was causing incorrect behavior. The current implementation does `bottom[0]->ShareDiff(*top[0])`, which means that if `bottom[0]` has its own diff `SyncedMemory`, it gets deleted, which in #2033 must have interacted with my sharing of input diffs in weird ways.

This changes the behavior to work the same way as `ReshapeLayer`. It also seems somewhat more natural, as the `bottom` blobs really "belong" to some other layer (whichever one outputs them as `top`s), whereas a layer's `top`s sort of belong to it, so it seems less dangerous for a layer to delete the underlying `SyncedMemory` of its own `top`s. However, I don't know of any actual cases where this causes problems in current Caffe, so I'll leave it to others to decide whether this should be merged.
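For reference, here is a minimal sketch of the sharing direction this change moves to, assuming the usual `Blob<Dtype>::ShareData`/`ShareDiff` API; it shows how the methods would look in `flatten_layer.cpp`, not the verbatim diff in this PR:

```cpp
// Sketch of the ReshapeLayer-style sharing direction (illustrative only):
// the top blob shares the bottom blob's data and diff, so bottom[0]'s own
// SyncedMemory is never replaced.

template <typename Dtype>
void FlattenLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  top[0]->ShareData(*bottom[0]);  // top reuses bottom's data memory
}

template <typename Dtype>
void FlattenLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom) {
  // Previously: bottom[0]->ShareDiff(*top[0]);  -- this dropped bottom's own
  // diff SyncedMemory. Share in the opposite direction instead:
  top[0]->ShareDiff(*bottom[0]);  // top reuses bottom's diff memory
}
```

Sharing from top to bottom keeps `bottom[0]`'s memory intact, which matters when another layer (or, as in #2033, shared input diffs) still holds references to it.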