final_state_tuple is a Tuple by vidurj · Pull Request #2645 · allenai/allennlp · GitHub
This repository was archived by the owner on Dec 16, 2022. It is now read-only.

final_state_tuple is a Tuple #2645

Merged
merged 7 commits into master from alternating-lstm on Mar 26, 2019
Conversation

vidurj (Contributor) commented on Mar 25, 2019

Fixes #2503. The final_state_tuple in the StackedAlternatingLstm was a generator instead of a tuple.
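For illustration only (not part of the original PR description), a minimal standalone sketch of the difference, using made-up tensor shapes; only the torch.cat / zip(*final_states) expression mirrors the patched line:

import torch

# Hypothetical per-layer final states, each a (hidden_state, memory_state) pair.
final_states = [(torch.zeros(1, 2, 3), torch.zeros(1, 2, 3)),
                (torch.ones(1, 2, 3), torch.ones(1, 2, 3))]

# Before the fix: a bare generator expression, which is not a tuple and cannot be indexed.
as_generator = (torch.cat(state_list, 0) for state_list in zip(*final_states))
assert not isinstance(as_generator, tuple)

# After the fix: the same expression materialized into a real tuple.
as_tuple = tuple(torch.cat(state_list, 0) for state_list in zip(*final_states))
assert isinstance(as_tuple, tuple)
assert as_tuple[0].shape == (2, 2, 3)   # stacked hidden states, indexable as expected

Downstream code (for example the Seq2VecWrapper exercised by the new test) expects an actual tuple of tensors rather than a lazily evaluated generator, which is what made the original line a bug.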

vidurj requested a review from brendan-ai2 on March 25, 2019 at 20:21
brendan-ai2 (Contributor) left a comment


LGTM, thanks!

@@ -106,5 +106,5 @@ def forward(self, # pylint: disable=arguments-differ
             output_sequence, final_state = layer(output_sequence, state)
             final_states.append(final_state)

-        final_state_tuple = (torch.cat(state_list, 0) for state_list in zip(*final_states))
+        final_state_tuple = tuple(torch.cat(state_list, 0) for state_list in zip(*final_states))
brendan-ai2 (Contributor):

Could you please add a return type on line 74 and correct the Returns section of the docstring?
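For context, a hedged sketch of the kind of annotation being asked for; the parameter list, return type, and docstring wording below are assumptions, not the exact code that was merged:

# Sketch only: names and types below are assumptions about the requested change.
from typing import Optional, Tuple
import torch
from torch.nn.utils.rnn import PackedSequence

def forward(self,  # pylint: disable=arguments-differ
            inputs: PackedSequence,
            initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]] = None
            ) -> Tuple[PackedSequence, Tuple[torch.Tensor, torch.Tensor]]:
    """
    ...
    Returns
    -------
    output_sequence : ``PackedSequence``
        The encoded sequence.
    final_states : ``Tuple[torch.Tensor, torch.Tensor]``
        The per-layer final (state, memory) states of the LSTM, concatenated
        along the first dimension, now returned as a tuple rather than a generator.
    """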

vidurj (Contributor, Author):

Done, thanks.


def test_wrapper_works_with_alternating_lstm(self):
model = PytorchSeq2VecWrapper(
StackedAlternatingLstm(input_size=100,
brendan-ai2 (Contributor):

No need for the test fixture to be this size, it makes them slow - just use 4,5 etc.

vidurj (Contributor, Author):

Good point! Thanks.
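For reference, a hedged sketch of the slimmed-down fixture the review asks for; the concrete sizes, the num_layers value, and the assertion are illustrative and may differ from the merged test. It assumes the method sits inside the existing seq2vec wrapper test class with torch, PytorchSeq2VecWrapper, and StackedAlternatingLstm already imported:

# Sketch of the test method with small sizes, as the review suggests.
def test_wrapper_works_with_alternating_lstm(self):
    model = PytorchSeq2VecWrapper(
            StackedAlternatingLstm(input_size=4,
                                   hidden_size=5,
                                   num_layers=3))
    input_tensor = torch.rand([2, 6, 4])
    mask = torch.ones(2, 6)
    output = model(input_tensor, mask)
    # The wrapper should return the final hidden state of the top layer.
    assert tuple(output.shape) == (2, 5)   # (batch_size, hidden_size)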

vidurj merged commit 305bd35 into master on Mar 26, 2019
vidurj deleted the alternating-lstm branch on March 26, 2019 at 01:50
reiyw pushed a commit to reiyw/allennlp that referenced this pull request Nov 12, 2019
TalSchuster pushed a commit to TalSchuster/allennlp-MultiLang that referenced this pull request Feb 20, 2020