Error when decoder has more than 1 layer. · Issue #312 · asyml/texar-pytorch
Open
@pajola

Description


The output is the following:
RuntimeError: Input batch size 128 doesn't match hidden[0] batch size 256

The issue is caused by the `initial_state=lstm_states` argument when the decoder's forward is called.
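For context, the batch-size mismatch in the traceback can be reproduced outside texar-pytorch with plain PyTorch. The sketch below is an assumption-laden illustration, not the example's actual code: it supposes the decoder ultimately wraps `torch.nn.LSTMCell`, and the sizes are made up. With two layers, the encoder's `lstm_states` carry a leading `num_layers` dimension; if that dimension gets folded into the batch dimension before the states reach a decoder cell, the cell sees a hidden batch of 256 against an input batch of 128, matching the reported error.

```python
# Hypothetical minimal reproduction (not texar-pytorch code):
# sizes and the use of LSTMCell are illustrative assumptions.
import torch
import torch.nn as nn

batch, input_size, hidden_size = 128, 16, 32
cell = nn.LSTMCell(input_size, hidden_size)
x = torch.randn(batch, input_size)

# Correct: for a single cell, each state tensor is (batch, hidden_size).
h0 = torch.zeros(batch, hidden_size)
c0 = torch.zeros(batch, hidden_size)
h1, c1 = cell(x, (h0, c0))  # works

# Broken: a 2-layer encoder's states, shaped (num_layers=2, 128, hidden),
# flattened into (256, hidden) before being handed to the cell.
bad_h = torch.zeros(2 * batch, hidden_size)
bad_c = torch.zeros(2 * batch, hidden_size)
err = None
try:
    cell(x, (bad_h, bad_c))
except RuntimeError as e:
    err = e  # batch-size mismatch, as in the reported traceback
```

One hedged workaround would be to keep the encoder states split per layer (e.g. pass layer `i` the pair `h[i], c[i]`) instead of flattening them, but the exact fix depends on the structure the texar decoder expects for `initial_state`.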

Labels: question (Further information is requested), topic: examples (Issue about examples)