8000 Allow redirecting ChatInterface mid-task with `adaptive` by ahuang11 · Pull Request #7941 · holoviz/panel · GitHub

Allow redirecting ChatInterface mid-task with adaptive #7941


Open · ahuang11 wants to merge 23 commits into main
Conversation

ahuang11
Contributor
[video attachment: demo.mp4]

Inspired by https://x.com/_catwu/status/1922352915076849743

Finally: Real-time steering. Send feedback to Claude Code while it's working, without waiting for completion. Claude incorporates your input immediately, adjusting its approach based on new requirements or clarifications.

To start building out that capability for use in Lumen, I added an `adaptive` param to ChatInterface, which lets users keep sending messages while a callback is still generating its response.

import asyncio

import panel as pn

pn.extension()

async def adaptive_callback(message, user, instance):
    # Simulate a long-running response
    for i in range(5):
        await asyncio.sleep(0.5)
        instance.send(f"Step {i+1}: Processing '{message}'...", respond=False)
    return f"Completed processing: {message}"

# Enable adaptive mode for real-time interruption
ci = pn.chat.ChatInterface(
    callback=adaptive_callback,
    adaptive=True,  # Allow interrupting responses
    placeholder_text="Send follow-ups anytime to interrupt and redirect!",
    callback_exception="verbose",
    sizing_mode="stretch_both"
)
ci.show()
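One way to picture the behavior is a cancel-and-restart pattern: when a new message arrives mid-task, the in-flight callback is stopped and the callback runs again with the updated input. The sketch below is purely illustrative (the `AdaptiveRunner` class and its names are hypothetical, not Panel's actual implementation):

```python
import asyncio


class AdaptiveRunner:
    """Toy illustration of a cancel-and-restart pattern.

    Hypothetical helper for demonstration only; Panel's adaptive
    mode is implemented inside ChatFeed/ChatInterface, not here.
    """

    def __init__(self, callback):
        self.callback = callback
        self._task = None

    async def send(self, message):
        # A new message cancels any callback that is still running...
        if self._task is not None and not self._task.done():
            self._task.cancel()
            try:
                await self._task
            except asyncio.CancelledError:
                pass
        # ...and the callback restarts with the latest message.
        self._task = asyncio.create_task(self.callback(message))
        return self._task


async def slow_callback(message):
    # Stand-in for a long-running LLM response.
    await asyncio.sleep(0.2)
    return f"done: {message}"


async def main():
    runner = AdaptiveRunner(slow_callback)
    first = await runner.send("original request")
    await asyncio.sleep(0.05)  # interrupt before the first finishes
    second = await runner.send("actually, do this instead")
    result = await second
    assert first.cancelled()  # the original task was abandoned
    return result
```

Running `asyncio.run(main())` returns the result of the redirected request while the original task ends up cancelled, which mirrors how a follow-up message in adaptive mode redirects the response rather than waiting for the first one to complete.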

@ahuang11 ahuang11 requested a review from philippjfr May 22, 2025 20:51
@ahuang11 ahuang11 changed the title Allow interrupting ChatInterface mid-task with adaptive Allow redirecting ChatInterface mid-task with adaptive May 22, 2025
codecov bot commented Jun 16, 2025

Codecov Report

Attention: Patch coverage is 97.23502% with 6 lines in your changes missing coverage. Please review.

Project coverage is 87.08%. Comparing base (bb77cc2) to head (a92fe23).

Files with missing lines Patch % Lines
panel/chat/feed.py 90.32% 6 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #7941      +/-   ##
==========================================
+ Coverage   87.05%   87.08%   +0.03%     
==========================================
  Files         346      346              
  Lines       53464    53654     +190     
==========================================
+ Hits        46542    46725     +183     
- Misses       6922     6929       +7     

@ahuang11
Contributor Author

In addition to adaptive, we should have a queue mode.
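Queue mode is only proposed here, not implemented. As a rough sketch of how it would differ from adaptive mode (all names hypothetical), it would buffer incoming messages and process them in arrival order instead of interrupting the running callback:

```python
import asyncio


async def run_queued(callback, messages):
    """Hypothetical queue mode: process each message in order,
    one callback at a time, rather than cancelling mid-task."""
    queue: asyncio.Queue = asyncio.Queue()
    for msg in messages:
        queue.put_nowait(msg)

    results = []
    while not queue.empty():
        msg = await queue.get()
        # Each callback runs to completion before the next starts.
        results.append(await callback(msg))
    return results


async def echo(message):
    await asyncio.sleep(0.01)  # stand-in for real work
    return f"done: {message}"
```

For example, `asyncio.run(run_queued(echo, ["a", "b"]))` finishes the response to "a" before starting on "b", whereas adaptive mode would abandon "a" as soon as "b" arrived.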
