feat(LangChainAgent): stream_response by maciejmajek · Pull Request #589 · RobotecAI/rai · GitHub

feat(LangChainAgent): stream_response #589

Merged: maciejmajek merged 3 commits into development from feat/stream_response on May 16, 2025

Conversation

@maciejmajek (Member) commented on May 15, 2025

Purpose

When using LangChainAgent with an llm configured without streaming=True, the output would be silently discarded.
This PR improves the agent's behavior under such a flawed configuration.

Proposed Changes

  • Add a stream_response parameter
  • Log an error on flawed configurations (a rough sketch follows below)
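The authoritative behavior is in the PR diff; as a purely illustrative sketch (the class and method names below are hypothetical, not the actual rai API), combining a stream_response flag with a configuration check could look roughly like this:

import logging
from typing import Any, Iterator

logger = logging.getLogger(__name__)

class StreamingAgentSketch:
    """Illustrative only: not the actual LangChainAgent implementation."""

    def __init__(self, llm: Any, stream_response: bool = True) -> None:
        self.llm = llm
        self.stream_response = stream_response
        # Log an error when the flag and the model configuration disagree,
        # instead of silently discarding the output.
        if self.stream_response and not getattr(llm, "streaming", False):
            logger.error(
                "stream_response=True but the llm was created without "
                "streaming=True; falling back to a single, non-streamed reply."
            )

    def respond(self, prompt: str) -> Iterator[str]:
        if self.stream_response and getattr(self.llm, "streaming", False):
            # Streaming path: yield chunks as the model produces them.
            for chunk in self.llm.stream(prompt):
                yield chunk.content
        else:
            # Non-streaming path: return the full reply at once.
            yield self.llm.invoke(prompt).content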

Issues

  • Links to relevant issues

Testing

terminal 1

from rai import get_llm_model, AgentRunner
from rai.agents import ReActAgent
from rai.communication.ros2 import ROS2HRIConnector, ROS2HRIMessage
from rai.communication.ros2 import ROS2Context

@ROS2Context()
def main():
    connector = ROS2HRIConnector() # Initialize communication over ROS 2

    agent = ReActAgent(
        # configure output
        target_connectors={
            "/to_human": connector,
        },
        llm=get_llm_model("simple_model", streaming=True), # toggle between True and False to test both paths
        stream_response=True,
    )
    # configure input
    agent.subscribe_source("/from_human", connector)

    runner = AgentRunner(agents=[agent])
    runner.run_and_wait_for_shutdown()

if __name__ == "__main__":
    main()

terminal 2

ros2 topic echo /to_human rai_interfaces/msg/HRIMessage

terminal 3

ros2 topic pub /from_human rai_interfaces/msg/HRIMessage "{\"text\": \"Move the arm to 0, 0, 0?\"}" --once
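
As an optional alternative to the topic echo in terminal 2, a small rclpy subscriber can count the messages arriving on /to_human, which makes it easy to see whether the reply arrives as many chunks (streamed) or as a single message. A minimal sketch, assuming the rai_interfaces package is built and sourced and that HRIMessage exposes the text field used in terminal 3:

import rclpy
from rclpy.node import Node
from rai_interfaces.msg import HRIMessage

class ToHumanCounter(Node):
    def __init__(self):
        super().__init__("to_human_counter")
        self.count = 0
        self.create_subscription(HRIMessage, "/to_human", self.on_msg, 10)

    def on_msg(self, msg: HRIMessage):
        # Each incoming message is one chunk of the agent's reply.
        self.count += 1
        self.get_logger().info(f"message {self.count}: {msg.text!r}")

def main():
    rclpy.init()
    rclpy.spin(ToHumanCounter())

if __name__ == "__main__":
    main()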

@maciejmajek maciejmajek requested a review from boczekbartek May 15, 2025 17:28
@boczekbartek (Member) commented:

I've tested with the snippet on Ubuntu 22.04 / ROS 2 Humble and it worked.

@boczekbartek boczekbartek self-requested a review May 16, 2025 08:56
@maciejmajek maciejmajek merged commit 520d347 into development May 16, 2025
6 checks passed
@maciejmajek maciejmajek deleted the feat/stream_response branch May 16, 2025 12:31