[Feature Advice] Detect chat template forced <think> tag and add it back to API response (LM Studio UI too) · Issue #261 · lmstudio-ai/lms
Open
@Originalimoc

Description

Some models' chat templates look like this:

......
{% if add_generation_prompt %}<|assistant|>
<think>{% endif %}

I'm hoping for this feature: after applying the chat template, detect whether the last few characters of the rendered prompt are "<think>" (the tag configurable in "Reasoning Section Parsing"), and whether the last message's role is "user" (to bypass the check when the request is a role=="assistant" prefill). If both hold, process the tag in the LM Studio UI, and in headless mode send this "Start String" ("<think>") back to the API client.
