Drag-and-drop Workflow Builder.
Thousands of AI models through the Huggingface Hub.
100% Open Source • Privacy-First • Self-Hosted
| Local-first by default | Open source & auditable | Huggingface Integration |
| --- | --- | --- |
| All model inference runs on your machine. No data leaves unless you decide. | NodeTool is 100% open source under AGPL, so you can audit every line of code. | Need more power? Use the GPU cloud node, and only for the data you select. |
NodeTool is the Swiss Army knife for AI builders. Unlike code-first stacks, NodeTool gives you every AI tool in one visual workspace. Connect your models, streamline your workflow, and turn ideas into reality.
- Drag-and-drop multimodal workflows
- LLM, image, audio, and video nodes
- Asset and vector store integration
- Agent orchestration
- AI cloud scaling
- Built-in templates and node packs
Drag any model into your canvas: LLMs, diffusion models, agents, or custom code. Connect with one click and watch your AI workflow come alive.
Keep data local until you need more power. Scale out to Huggingface, Anthropic, OpenAI, Gemini, Fal.ai, or Replicate.
Access and trigger AI workflows through a unified chat interface.
Build intelligent agents that coordinate multiple AI models. Chain reasoning, planning, and execution across complex multi-step workflows.
Built-in ChromaDB means your AI remembers everything. Create smart assistants that know your documents.
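To give a feel for what the built-in vector store does under the hood, here is a minimal sketch using the plain chromadb Python API directly. The collection name and documents are made up; inside NodeTool, the vector-store nodes handle these calls for you.

```python
# Minimal ChromaDB sketch: index a few documents locally and query them.
# NodeTool's vector-store nodes wrap similar calls inside a workflow.
import chromadb

client = chromadb.PersistentClient(path="./chroma_db")  # data stays on local disk
collection = client.get_or_create_collection("my_notes")  # hypothetical collection name

collection.add(
    ids=["note-1", "note-2"],
    documents=[
        "NodeTool runs model inference locally by default.",
        "Cloud nodes are only used when you explicitly connect them.",
    ],
)

# Retrieve the most relevant note for a question
# (uses Chroma's default local embedding model).
results = collection.query(query_texts=["Where does inference run?"], n_results=1)
print(results["documents"][0][0])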
Import, organize, and manage all your media assets in one place. No more hunting through folders.
Additional Features:
- Multimodal Capabilities: Process text, images, audio, and video within a single workflow
- System Tray Integration: Access workflows quickly via the system tray with global shortcuts
- Ready-to-Use Templates: Start quickly with pre-built workflow templates
- Mini-App Builder: Convert workflows into standalone desktop applications
- API Access: Integrate with external applications and services
- Custom Python Extensions: Extend functionality with custom Python scripts and nodes (see the sketch after this list)
- Cross-Platform: Build and run on Mac, Windows, and Linux
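As a rough illustration, the sketch below shows what a custom Python node might look like. The import path, base class, and method signature are assumptions for illustration only; check the nodetool-core documentation for the actual extension API.

```python
# Hypothetical sketch of a custom NodeTool node.
# The import path, base class, and process() signature are assumptions;
# the real extension API lives in nodetool-core.
from nodetool.workflows.base_node import BaseNode  # assumed module path


class WordCount(BaseNode):
    """Counts the words in a piece of text."""

    text: str = ""

    async def process(self, context) -> int:
        # Split on whitespace and return the number of tokens.
        return len(self.text.split())
```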
Keep your data on-device until you need the Huggingface Inference API. Unlike cloud-only tools, NodeTool protects your privacy while giving you access to the world's largest AI model collection. Trust through transparency: every line of code is open source.
- Process sensitive data locally
- Run LLMs on your hardware
- Zero data transmission by default
- Burst to GPU cloud in seconds
- Connect OpenAI, Anthropic, Fal, Replicate, and Gemini
- Control exactly what data gets shared
All critical execution happens inside the local process unless a cloud node is explicitly connected:
- Local LLMs via Ollama (see the sketch after this list)
- Local Huggingface models with CUDA and MPS acceleration
- Chroma Vector database for RAG
- User-friendly Chat Interface
- Powerful AI Workflow builder
- Bring your own API Keys if needed (Huggingface, Anthropic, OpenAI, Gemini, Fal.ai, Replicate)
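To show the kind of local-only call the Ollama nodes wrap, here is a short sketch using the ollama Python client directly. The model name is just an example and must already be pulled locally.

```python
# Local chat completion via Ollama -- nothing leaves your machine.
# Assumes the Ollama server is running and the model has been pulled,
# e.g. with `ollama pull llama3`. The model name here is only an example.
import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Summarize local-first AI in one sentence."}],
)
print(response["message"]["content"])
```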
AGPL-3.0 License with zero lock-in • Privacy-first Desktop App • Fully self-hostable with no telemetry
- Windows / Mac / Linux: Download the installer here
- Launch the installer and run NodeTool.
Note: Requires an Nvidia GPU or Apple Silicon (M1+) and at least 20GB of free disk space for model downloads.
- Open NodeTool.
- Choose a prebuilt template or start with a blank canvas.
- Drag and drop AI nodes and connect them visually.
- Click Run and watch your local AI workflow execute!
Design sophisticated AI agents capable of handling complex, multi-step tasks using NodeTool's agent framework.
Core Capabilities:
- Strategic Task Planning: Automatically break down complex objectives into structured, executable plans.
- Chain of Thought Reasoning: Enable agents to perform step-by-step problem solving with explicit reasoning paths.
- Tool Integration: Equip agents with tools for web browsing, file operations, API calls, and more.
- Streaming Results: Get live updates as agents reason and execute tasks.
NodeTool includes several pre-built agent examples:
- Wikipedia-Style Research Agent: Generates structured documentation via web research.
- ChromaDB Research Agent: Processes and indexes documents for semantic querying.
- Social Media Analysis Agents: Track and analyze content from Twitter/X, Instagram, and Reddit.
- Professional Research Tools: Analyze the LinkedIn job market and perform advanced Google searches.
- Utility Agents: Process emails and integrate web search capabilities.
Find full implementations and more examples in the `examples` directory of the `nodetool-core` repository.
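For orientation, here is a deliberately simplified, hypothetical sketch of the plan-then-execute pattern these agents follow. It is not the actual nodetool-core agent API, just an illustration of planning, tool use, and streamed progress.

```python
# Simplified, hypothetical plan-then-execute loop, for illustration only.
# It is NOT the nodetool-core agent API; it shows the general pattern:
# break an objective into steps, run a tool per step, and stream progress.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Step:
    description: str
    tool: Callable[[str], str]  # a tool takes a query string and returns text


def run_agent(objective: str, plan: list[Step]) -> list[str]:
    results = []
    for i, step in enumerate(plan, start=1):
        print(f"[{i}/{len(plan)}] {step.description}")  # streamed progress
        results.append(step.tool(objective))            # execute the step's tool
    return results


# Example usage with stand-in tools (real agents would browse the web,
# query ChromaDB, call APIs, etc.).
def fake_search(query: str) -> str:
    return f"search results for: {query}"


def fake_summarize(query: str) -> str:
    return f"summary of findings about: {query}"


plan = [
    Step("Gather sources", fake_search),
    Step("Summarize findings", fake_summarize),
]
print(run_agent("history of local-first software", plan))
```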
From simple automations to complex multi-agent systems:
Create AI that knows your documents, emails, and notes. Keep everything private on your machine.
Turn repetitive work into smart workflows. Let AI handle the routine while you focus on creating.
From text to images to music β create anything with AI. Combine models for unique results.
Upscale, enhance, and transform visual content. Professional results with consumer hardware.
Transcribe, analyze, and generate speech. Build voice-first applications that actually work.
Turn spreadsheets into insights. Create charts, find patterns, and make decisions faster.
Chain LLMs with diffusion models. Create workflows that no single AI can handle alone.
From desktop shortcuts to web APIs. Your workflows run where your users are. (Coming soon)
Add custom Python nodes when you need them. The visual canvas grows with your expertise.
See exactly what your AI is thinking. Debug workflows with clear visual feedback.
Export workflows as code or templates. Build on what others have created. (Coming soon)
Connect with other NodeTool users and the development team:
- Star us on GitHub: github.com/nodetool-ai/nodetool
- Join the Discussion: Discord Community
- Contribute: Help shape the future of local-first AI. See Contributing below.
Let's build amazing AI workflows together!
Version 0.6 is in pre-release.
Follow these steps to set up a local development environment for the entire NodeTool platform, including the UI, backend services, and the core library (`nodetool-core`). If you are primarily interested in contributing to the core library itself, please also refer to the `nodetool-core` repository for its specific development setup using Poetry.
- Python 3.11: Required for the backend.
- Conda: Download and install from miniconda.org.
- Node.js (Latest LTS): Required for the frontend. Download and install from nodejs.org.
```bash
# Create and activate the Conda environment
conda create -n nodetool python=3.11 -y
conda activate nodetool

# Install essential system dependencies via Conda
conda install -c conda-forge ffmpeg cairo x264 x265 aom libopus libvorbis lame pandoc uv -y
```
These are the essential packages to run NodeTool.
Make sure to activate the conda environment.
```bash
# Install nodetool-core and nodetool-base
# On macOS / Linux / Windows:
uv pip install git+https://github.com/nodetool-ai/nodetool-core
uv pip install git+https://github.com/nodetool-ai/nodetool-base
```
NodeTool's functionality is extended via packs. Install only the ones you need.
NOTE:
- Activate the conda environment first
- Always use uv, since we define an extra index for PyTorch
```bash
# List available packs (optional)
nodetool package list -a

# Example: Install packs for specific integrations
uv pip install git+https://github.com/nodetool-ai/nodetool-huggingface
uv pip install git+https://github.com/nodetool-ai/nodetool-fal
uv pip install git+https://github.com/nodetool-ai/nodetool-replicate
uv pip install git+https://github.com/nodetool-ai/nodetool-elevenlabs
```
Note: Some packs like `nodetool-huggingface` may require specific PyTorch versions or CUDA drivers. Use the `--extra-index-url` flag when necessary.
Ensure the `nodetool` Conda environment is active.
Option A: Run Backend with Web UI (for Development)
This command starts the backend server:
```bash
# On macOS / Linux / Windows:
nodetool serve --reload
```
Run the frontend from the `web` folder:
```bash
cd web
npm start
```
Access the UI in your browser at http://localhost:3000.
Option B: Run with Electron App
This provides the full desktop application experience.
Configure Conda Path: Ensure your `settings.yaml` file points to your Conda environment path:
- macOS/Linux: `~/.config/nodetool/settings.yaml`
- Windows: `%APPDATA%/nodetool/settings.yaml`
```yaml
CONDA_ENV: /path/to/your/conda/envs/nodetool # e.g., /Users/me/miniconda3/envs/nodetool
```
Build Frontends: You only need to do this once or when frontend code changes.
```bash
# Build the main web UI
cd web
npm install
npm run build
cd ..

# Build the apps UI (if needed)
cd apps
npm install
npm run build
cd ..

# Build the Electron UI
cd electron
npm install
npm run build
cd ..
```
Start Electron:
```bash
cd electron
npm start # launches the desktop app using the previously built UI
```
The Electron app will launch, automatically starting the backend and frontend.
We welcome community contributions!
- Fork the repository on GitHub.
- Create a new branch for your feature (`git checkout -b feature/your-feature-name`).
- Make your changes and commit them (`git commit -am 'Add some amazing feature'`).
- Push your branch to your fork (`git push origin feature/your-feature-name`).
- Open a Pull Request against the `main` branch of the original repository.
Please follow our contribution guidelines and code of conduct.
AGPL-3.0
We'd love to hear from you! Whether you have questions, suggestions, or feedback, feel free to reach out through any of the following channels:
- NodeTool Platform Repository: github.com/nodetool-ai/nodetool
- NodeTool Core Library Repository: github.com/nodetool-ai/nodetool-core
- Email: hello@nodetool.ai
- Discord Community: Join us on Discord
- Community Forum: Visit the NodeTool Forum
- GitHub Issues: Report issues or request features
- Project Leads: Matthias Georgi (matti@nodetool.ai), David Bührer (david@nodetool.ai)
We're excited to collaborate and build amazing AI workflows together!
Extend NodeTool's capabilities with specialized Node Packs. The NodeTool Packs Registry manages discovery, installation, and distribution.
Manage packs easily through the NodeTool UI:
- Browse available packs.
- Install, uninstall, and update packs (uses `pip` behind the scenes).
- View pack details and documentation.
Alternatively, install directly via `pip` (see Development Setup).
Refer to the NodeTool Registry repository for detailed guidelines on creating and publishing packs.
The documentation site is built with Jekyll on GitHub Pages. Markdown files live in the `docs/` directory, and changes on `main` are deployed automatically. Start with the Getting Started guide and browse our new Tips and Tricks section for handy workflow shortcuts.