Activation layers inherit rank from source by dguest · Pull Request #99 · lwtnn/lwtnn · GitHub

Activation layers inherit rank from source #99


Merged: 1 commit merged into lwtnn:master on Oct 4, 2019

Conversation

dguest (Collaborator) commented on Oct 3, 2019

Previously the converter scripts would assume that any "activation"
layer in Keras was a feed-forward layer. This broke in cases where the
activation was operating on a sequence.
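
For concreteness, a model of the kind that hits this case looks something like the sketch below; the layer sizes and activation choice are hypothetical, not taken from a specific user's network:

```python
# Hypothetical Keras model where an Activation layer operates on a
# sequence: the LSTM emits one vector per time step, so the softmax
# must be applied per time step, not to a single feature vector.
from tensorflow.keras.layers import Activation, Input, LSTM
from tensorflow.keras.models import Model

inputs = Input(shape=(None, 4))                  # variable-length sequence, 4 features
hidden = LSTM(8, return_sequences=True)(inputs)  # sequence output
outputs = Activation('softmax')(hidden)          # converter wrongly assumed feed-forward
model = Model(inputs, outputs)
```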

This commit makes all activation functions inherit their source node's
rank: if the source is a sequence, the activation function will be
time-distributed; if the source is feed-forward, the activation will
be feed-forward as well.
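
The idea can be sketched roughly as follows; `Rank`, `Node`, and `convert_activation` are illustrative names only, not the actual lwtnn converter API:

```python
# Minimal sketch of rank inheritance for activation layers; names are
# hypothetical, not the real lwtnn converter internals.
from enum import Enum

class Rank(Enum):
    FEED_FORWARD = 1  # operates on a single feature vector
    SEQUENCE = 2      # operates on one vector per time step

class Node:
    def __init__(self, layer_type, rank):
        self.layer_type = layer_type
        self.rank = rank

def convert_activation(source, function):
    """Build the node for a Keras Activation layer.

    Rather than hard-coding Rank.FEED_FORWARD, inherit the source
    node's rank: a sequence source yields a time-distributed
    activation, a feed-forward source yields a feed-forward one.
    """
    return Node(layer_type=function, rank=source.rank)

# An activation fed by a sequence-producing layer stays a sequence node.
lstm_node = Node('lstm', Rank.SEQUENCE)
softmax_node = convert_activation(lstm_node, 'softmax')
assert softmax_node.rank is Rank.SEQUENCE
```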

@dguest merged commit 26e4742 into lwtnn:master on Oct 4, 2019