
Releases: sandipan1/hica

HICA v1.2.0: Memory Lane & Workflow Control

14 Jul 15:52
d68254d

What’s New?

Build Workflows Your Way

You’re no longer stuck with a one-size-fits-all agent loop. Now you can:

  • Mix and match LLM calls, parameter generation, and tool execution however you like.
  • Take full control: run things step by step, or let the agent do its thing.
  • Orchestrate complex tasks, branch logic, and handle results just the way you want.
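To make this concrete, here is a minimal sketch of a hand-rolled, step-by-step loop. The method and import names (select_tool, generate_params, execute, add_user_message, add_tool_result) are illustrative assumptions, not HICA's documented API.

```python
# Hypothetical step-by-step orchestration. Method names (select_tool,
# generate_params, execute, add_user_message, add_tool_result) are
# illustrative assumptions, not HICA's documented API.
from hica import Agent, AgentConfig, ToolRegistry  # assumed import paths
from hica.state import Thread                      # assumed import path

async def run_step_by_step(user_input: str) -> Thread:
    registry = ToolRegistry()
    agent = Agent(config=AgentConfig(), tool_registry=registry)
    thread = Thread()
    thread.add_user_message(user_input)

    # LLM call #1: decide which tool (if any) to use.
    tool_choice = await agent.select_tool(thread)

    if tool_choice is not None:
        # LLM call #2: generate parameters for that tool.
        params = await agent.generate_params(thread, tool_choice)
        # Execute the tool yourself and record the result on the thread.
        result = await registry.execute(tool_choice, params)
        thread.add_tool_result(tool_choice, result)

    return thread

# asyncio.run(run_step_by_step("Summarise yesterday's error logs"))
```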

Memory That Grows With You

We’ve reimagined memory in HICA:

  • Store conversations, prompts, configs, or anything else—using in-memory, file, SQLite, or MongoDB.
  • All memory types share the same interface, so you can swap them in and out as your project grows (see the sketch after this list).
  • Perfect for both quick experiments and production deployments.
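A minimal sketch of swapping backends behind one interface; the class names and set/get methods below are assumptions for illustration, not necessarily HICA's exact API.

```python
# Hypothetical backend swap: the class names and set/get methods are
# illustrative assumptions, not necessarily HICA's exact API.
from hica.memory import FileMemoryStore, SQLiteMemoryStore  # assumed imports

def build_store(env: str):
    """Pick a backend without changing any calling code."""
    if env == "prod":
        return SQLiteMemoryStore(path="hica_memory.db")
    return FileMemoryStore(path="./dev_memory")

store = build_store("dev")
store.set("system_prompt", "You are a helpful assistant.")
print(store.get("system_prompt"))
```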

Real-World Examples & Better Docs

  • Check out new examples like workflow.py for flexible orchestration, and large_context_only.py for handling big chunks of context.
  • More tests, clearer docs, and a smoother developer experience.

Built for Production (and for You)

  • Everything is composable, testable, and ready for real-world use.
  • Human-in-the-loop, event-sourced, and fully auditable—so you can trust what your agent is doing.

Thanks for being part of the HICA journey.
We can’t wait to see what you build with these new tools!

What's Changed

  • Refactor memory management and update README by @sandipan1 in #2
  • feat: Unify memory management, add flexible agent APIs, expand examples and tests by @sandipan1 in #5

Full Changelog: v1.0.1...v1.2.0

HICA v1.0.1 - little cleanups

24 Jun 04:48

🚀 HICA v1.0.0 — First Public Release!

23 Jun 15:16

We’re excited to announce the first public release of HICA: Highly Customizable Agent Library — the fast, Pythonic way to build transparent, controllable, and production-ready AI agents.


🌟 Key Features

🧠 Transparent, Observable Agent Workflows

  • Event-Sourced Architecture: Every action (user input, LLM call, tool call, tool response, clarification) is recorded as an Event in a persistent Thread for full traceability and auditability.
  • Real-Time Event Streaming: The agent loop is an async generator, yielding each event as it happens for instant logging, UI updates, or streaming to clients (see the sketch after this list).
  • Structured Logging: All agent actions and tool calls are logged for easy debugging and compliance.
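As a rough sketch of consuming the event stream (method and attribute names such as agent_loop, add_user_message, and the event fields are assumptions based on the description above):

```python
# Hypothetical consumer of the streaming agent loop; agent_loop,
# add_user_message, and the event fields are assumed names.
import asyncio

from hica import Agent, AgentConfig, ToolRegistry  # assumed import paths
from hica.state import Thread                      # assumed import path

async def main() -> None:
    agent = Agent(config=AgentConfig(), tool_registry=ToolRegistry())
    thread = Thread()
    thread.add_user_message("What is 17 * 23?")

    # The loop is an async generator: each yielded event can be logged,
    # pushed to a UI, or persisted the moment it happens.
    async for event in agent.agent_loop(thread):
        print(event.type, event.data)

asyncio.run(main())
```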

🛠️ Unified Tool Management

  • Composable Tools: Register both local Python functions and remote MCP tools in a unified registry.
  • MCPConnectionManager: Seamlessly connect to MCP servers, fetch available tools, and execute remote tool calls.
  • Automatic Tool Registration: Tool properties (name, description, parameters) are extracted from function signatures and docstrings, or loaded from MCP schemas.
  • Type Safety: All tool calls are validated against Pydantic models generated from tool schemas, ensuring robust parameter validation.
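For example, a plain typed Python function can become a tool; add_tool() is named in the v0.2.0 notes further down, but the exact call signature shown here is an assumption.

```python
# Registering a local Python tool. add_tool() is named in the v0.2.0 notes
# below; the bare-function call shown here is an assumed signature.
from hica import ToolRegistry  # assumed import path

registry = ToolRegistry()

def convert_temperature(celsius: float) -> float:
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

registry.add_tool(convert_temperature)

# Name, description, and parameters are taken from the signature and
# docstring; calls are validated against a generated Pydantic model,
# so e.g. celsius="warm" would be rejected before execution.
```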

🔄 Stateful, Persistent, and Resumable

  • Thread & ThreadStore: All conversation state is managed in a Thread and persisted via ThreadStore (file-based or DB-based).
  • Resumable Sessions: Pause and resume agent workflows at any time, even after a system restart (see the sketch after this list).
  • Metadata Support: Attach arbitrary metadata to threads and events for advanced workflow tracking.
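A sketch of the persistence flow; the ThreadStore constructor and save/get methods are assumptions guided by the bullets above.

```python
# Hypothetical persist-and-resume flow; the ThreadStore constructor and
# save/get method names are assumptions.
from hica.state import Thread, ThreadStore  # assumed import path

store = ThreadStore("threads.json")  # file-based backend (assumed constructor)

thread = Thread(metadata={"customer_id": "c-42"})
thread_id = store.save(thread)       # persist every event recorded so far

# ...the process exits, the machine restarts...

resumed = store.get(thread_id)       # reload the full event history
print(len(resumed.events), "events restored")
```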

👤 Human-in-the-Loop (HITL) by Design

  • Clarification Requests: Agents can pause and request clarification from a human, logging these as first-class events.
  • Seamless Resumption: Resume workflows with new user input at any time.
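In practice the driver might look something like this sketch: watch for a clarification event, collect an answer, and re-enter the loop. The event type string and method names are illustrative assumptions.

```python
# Hypothetical human-in-the-loop driver; the event type string and the
# agent_loop/add_user_message names are illustrative assumptions.
from hica import Agent          # assumed import path
from hica.state import Thread   # assumed import path

async def run_with_clarifications(agent: Agent, thread: Thread) -> None:
    while True:
        paused_for_input = False
        async for event in agent.agent_loop(thread):
            if event.type == "clarification_request":
                answer = input(f"Agent asks: {event.data}\n> ")
                thread.add_user_message(answer)  # resume with new user input
                paused_for_input = True
                break                            # restart the loop
        if not paused_for_input:
            return                               # finished without clarification
```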

⚡ FastAPI & Streamlit Integrations

  • Production-Ready API: Example FastAPI servers for local and MCP-enabled agent workflows.
  • Interactive Web UI: Streamlit apps for real-time chat, event log visualization, and human-in-the-loop workflows.

🧩 Customizability & Extensibility

  • Decoupled Workflow Steps: Tool selection, parameter generation, and execution are modular and overrideable.
  • Pluggable LLM Providers: Use any async-compatible LLM backend by configuring AgentConfig.
  • Flexible Metadata & State: Easily extend agent, thread, or event metadata for your domain.

🧪 Testing & Reliability

  • Comprehensive Test Suite: Pytest-based tests for all core features.
  • Robust Error Handling: Clear error messages and safe fallback behaviors.

📦 Example Gallery

  • Basic Agent: basic_agent.py
  • MCP Tools: mcp_agent.py
  • File Tools: file_tools.py
  • Async Agent Loop: async_agent_loop.py
  • Human-in-the-Loop: human_in_loop_example.py
  • Web UI: streamlit_app.py

💡 Why HICA?

  • Production-Ready: Designed for reliability, auditability, and extensibility.
  • Unified Tooling: Mix and match local Python and remote MCP tools.
  • Transparent: Every step is logged and persisted for debugging and compliance.
  • Human-in-the-Loop: Agents can pause for user input or approval at any time.
  • Open Source: Community-driven and vendor-neutral.

🚀 Get Started

pip install hica

or for all optional dependencies:

pip install 'hica[all]'

Requires Python 3.12+.


I can’t wait to see what you build with HICA!
Questions or feedback? Open an issue or join the discussion.


Full Changelog: v0.3.0...v1.0.0

Real-Time Event Streaming, Efficient Polling, and Improved Agent Visibility

22 Jun 14:02

Release Note:

This release brings major improvements to agent workflow transparency, real-time interactivity, and developer experience:

✨ Highlights

  • Async Generator for Real-Time Event Streaming: Agents now yield intermediate thread states as they occur, enabling real-time monitoring, streaming UIs, and step-by-step persistence.
  • Efficient Event Polling: A new API endpoint lets clients fetch only the events added since their last update, reducing bandwidth and improving responsiveness (see the sketch after this list).
  • Streamlit App Real-Time Updates: The chat UI now auto-refreshes and polls for new events while a job is running, stopping automatically when the job is complete.
  • UI/UX Enhancements: Improved intent badge visibility and overall chat readability.
  • Robust Session State Handling: More reliable event tracking and thread resumption in the frontend.
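As a rough illustration of how a client could use incremental polling, here is a sketch; the URL path, since parameter, and response shape are assumptions, not the actual endpoint.

```python
# Hypothetical polling client; the URL path, "since" parameter, and
# response shape are assumptions, not the real API.
import time

import requests

BASE = "http://localhost:8000"   # example FastAPI server address (assumed)
thread_id = "demo-thread"
last_seen = 0

while True:
    resp = requests.get(
        f"{BASE}/threads/{thread_id}/events",
        params={"since": last_seen},   # only events after this index
        timeout=5,
    )
    events = resp.json()
    for event in events:
        print(event)
    last_seen += len(events)

    # Stop polling once a terminal event arrives (assumed event type).
    if any(e.get("type") == "final_response" for e in events):
        break
    time.sleep(1)
```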

🚀 Release v0.2.0 – The Interactive Intelligence Update!

22 Jun 06:19

This release is a substantial one for hica, transforming it from a simple library into a sophisticated, interactive, and transparent agentic framework.
We’ve focused on making the agent’s operations visible and controllable in real time.

✨ Key Features & Enhancements

🖥️ Live, Interactive Web UI
• The new flagship example is a full-fledged web application powered by FastAPI and Streamlit.
• The UI streams the agent’s operations live, giving you real-time visibility into its actions.
• ✅ No more waiting for a final answer!

🧠 Transparent Reasoning – See What the Agent is Thinking
• The agent now records its reasoning behind tool selections.
• These justifications are captured and displayed in the UI.
• Gain insight into why decisions are made, not just what’s returned.

🗨️ Full Conversation Resumability
• The agent can now pause and ask for help when it needs clarification.
• Conversations stop at key moments, and you can interactively resume by providing input.
• Fully supported in the new web UI!

🧩 Dynamic Tool Management
• The ToolRegistry is now dynamic!
• Use add_tool() and remove_tool() at runtime (see the sketch after this list).
• Enables fine-grained control over the agent’s capabilities throughout its lifecycle.
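A quick sketch of granting and revoking a capability at runtime; add_tool()/remove_tool() are named above, while the argument shapes are assumptions.

```python
# add_tool()/remove_tool() are named in these notes; passing a bare function
# and removing by name are assumptions about their signatures.
from hica import ToolRegistry  # assumed import path

def search_docs(query: str) -> str:
    """Search the project docs and return the best matching snippet."""
    return f"(stub result for {query!r})"

registry = ToolRegistry()
registry.add_tool(search_docs)        # grant the capability mid-session

# ...later, e.g. when entering a restricted phase of the workflow...
registry.remove_tool("search_docs")   # revoke it again at runtime
```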

🔌 Provider-Agnostic LLM Support
• Switched from a hardcoded OpenAI client to instructor.from_provider (see the sketch after this list).
• Easily integrate different LLM providers.
• More flexibility and control for your agentic stack.
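Switching providers is then roughly a one-line change; the model strings below and the way the client is wired into the agent are assumptions, not taken from HICA's docs.

```python
# Provider switching via instructor.from_provider. The model strings and
# the AgentConfig wiring below are assumptions, not HICA's documented API.
import instructor

# Pick a backend with a single provider/model string.
client = instructor.from_provider("openai/gpt-4o-mini")
# client = instructor.from_provider("anthropic/claude-3-5-haiku-latest")

# agent = Agent(config=AgentConfig(llm_client=client), ...)  # hypothetical wiring
```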

👀 It Has Eyes! (Image Support)
• Tools can now generate and return images.
• The Streamlit UI renders them inline, enabling more visual and powerful interactions.

🛠️ Architectural Improvements

🧹 Major Code Refactoring
• Core hica library is now cleanly separated from examples.
• All server and UI logic lives under the examples/ folder.
• Makes the core library leaner, cleaner, and more focused.

📦 Optional Dependencies
• Dependencies like streamlit and requests are now optional.
• Install them when needed via:

pip install hica[examples]

• The core package is now lightweight by default.

🔮 Summary

This release marks a significant leap forward in:
• Usability
• Transparency
• Flexibility

We can’t wait to see what you build with these new powerful features!

👉 Check out the GitHub repo and get started!
💬 Feedback and contributions are always welcome.

Full Changelog: v0.1.0...v0.2.0

v0.1.0 — Initial PyPI Release

20 Jun 17:35
Added a GitHub workflow for publishing to PyPI.

v0.2 — MCP Integration Release

19 Jun 22:14
e976777

MCP Integration

🚀 Major Features

  • MCP Tool Integration:
    Seamlessly register and invoke tools from remote MCP servers (e.g., FastMCP, SQLite, custom tool servers) alongside local Python tools.
  • Connection Manager:
    Added MCPConnectionManager for robust, reusable MCP client connections; no manual context management is required (see the sketch after this list).
  • Unified Tool Registry:
    The ToolRegistry now provides a single interface for both local and MCP tools, with automatic serialization and logging.
  • Advanced Serialization:
    All tool results (including FastMCP content types) are normalized for safe storage, logging, and downstream processing.
  • Agent Orchestration:
    Agents can now reason about, select, and execute both local and MCP tools in a single workflow, with full event traceability.
  • Improved Event Model:
    The Event model now supports lists and robust serialization, eliminating Pydantic warnings and improving compatibility.
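Putting the pieces together might look like the sketch below; the MCPConnectionManager constructor, load_mcp_tools helper, and import paths are assumptions, while the load-before-run ordering follows the breaking-changes note further down.

```python
# Hypothetical local + MCP workflow; constructor arguments, load_mcp_tools,
# and import paths are assumptions. MCP tools are loaded via the connection
# manager and registry before the agent runs, per the breaking-changes note.
import asyncio

from hica import Agent, AgentConfig, ToolRegistry  # assumed import paths
from hica.mcp import MCPConnectionManager          # assumed import path

def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

async def main() -> None:
    registry = ToolRegistry()
    registry.add_tool(add)                                       # local tool

    manager = MCPConnectionManager("http://localhost:9000/mcp")  # assumed ctor
    await registry.load_mcp_tools(manager)                       # remote tools

    agent = Agent(config=AgentConfig(), tool_registry=registry)
    # The agent can now reason over and execute both kinds of tools
    # in a single workflow, with every step recorded as an event.

asyncio.run(main())
```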

🛠️ Improvements

  • Enhanced logging for tool execution (local vs. MCP).
  • Clarified and documented agent, registry, and connection manager usage patterns.
  • Example scripts and tests updated for new MCP workflow.

⚠️ Breaking Changes

  • The Event model now expects all tool results to be serialized before assignment.
  • MCP tools must be loaded via the connection manager and registry before agent execution.

📚 Documentation

See the updated README and examples for how to use MCP tools with hica agents.


Full Changelog: v0.1...v0.2

v0.1

17 Jun 16:31
ba0a776

Initial Release

HICA (Highly Customisable Agents) v0.1 marks our first stable release, providing a robust foundation for building AI agents that work alongside humans.

✨ Key Features

Core Agent System

  • Autonomous agent with LLM integration
  • Two-stage LLM processing (tool selection & parameter filling)
  • Configurable system prompts

Human-in-the-Loop Capabilities

  • Approval workflows for sensitive operations
  • Clarification requests when input is ambiguous
  • Human response handling
  • Tool execution approval system

Tool Management

  • Tool registry system
  • Dynamic tool parameter handling
  • Built-in calculator tools as examples
  • Easy tool registration and execution

State Management

  • Thread-based conversation tracking
  • Event-based state management
  • In-memory thread storage
  • Extensible storage backend

API & Server

  • FastAPI integration
  • RESTful endpoints for thread management
  • Response and approval handling

Logging & Monitoring

  • Structured logging with structlog
  • Event tracking
  • Debug and info level logging
  • JSON-formatted logs