
Typo: Chapter 1 - How do Transformers work? 4.mdx #898


Open
code4funn opened this issue Apr 26, 2025 · 0 comments

Comments

@code4funn

There is a typo at the end of line 37, and line 38 should come before line 48.

Current state of line 37:

  • January 2022: InstructGPT, a version of GPT-3 that was trained to follow instructions better
    This list is far from comprehensive, and is just meant to highlight a few of the different kinds of Transformer models. Broadly, they can be grouped into three categories:

Expected line 37:

  • January 2022: InstructGPT, a version of GPT-3 that was trained to follow instructions better.

Expected line 48:

This list is far from comprehensive, and is just meant to highlight a few of the different kinds of Transformer models. Broadly, they can be grouped into three categories:

  • GPT-like (also called auto-regressive Transformer models)
  • BERT-like (also called auto-encoding Transformer models)
  • T5-like (also called sequence-to-sequence Transformer models)