8000 e2e-nvidia-l40s-x4-llama.yml failure: vllm fails to start · Issue #3466 · instructlab/instructlab · GitHub
Open
@ktdreyer

Description

In this recent run for E2E (NVIDIA L40S x4) LLAMA, vLLM fails to start:

    vLLM failed to start. Retry with --enable-serving-output to learn more about the failure.
    Using unknown model family: meta-llama specified by user.

Metadata

Assignees

No one assigned

    Labels

    CI/CD (Affects CI/CD configuration), bug (Something isn't working)
