From idea to running code in minutes, not weeks. LocalCloud delivers developer-friendly PostgreSQL, MongoDB, vector databases, AI models, Redis cache, job queues, and S3-like storage instantly. No DevOps, no cloud bills, no infrastructure drama.
🌐 Programming Language Agnostic - Works seamlessly with Python, Node.js, Go, Java, Rust, PHP, .NET, or any language. LocalCloud provides standard service APIs (PostgreSQL, MongoDB, Redis, S3, etc.) that integrate with your existing code regardless of technology stack.
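Because LocalCloud exposes stock protocols on its default ports, any client library works unchanged. A minimal sketch in Python, assuming the default ports listed later in this README; the database name `app` and the client calls in the comments are illustrative, not LocalCloud-specific - check `lc status` or `.localcloud/config.yaml` for your actual connection details:

```python
# Sketch: reaching LocalCloud services with ordinary client libraries.
# Ports are LocalCloud's defaults; the database name "app" is a placeholder.

def service_dsn(service: str, host: str = "localhost") -> str:
    """Return a standard connection string for a LocalCloud service."""
    defaults = {
        "postgres": f"postgresql://{host}:5432/app",  # plain PostgreSQL protocol
        "mongodb": f"mongodb://{host}:27017/app",     # plain MongoDB protocol
        "redis": f"redis://{host}:6379/0",            # plain Redis protocol
        "s3": f"http://{host}:9000",                  # MinIO's S3-compatible endpoint
    }
    return defaults[service]

# Any stack consumes these the same way, e.g. in Python:
#   psycopg2.connect(service_dsn("postgres"))
#   redis.Redis.from_url(service_dsn("redis"))
#   boto3.client("s3", endpoint_url=service_dsn("s3"))

print(service_dsn("postgres"))  # postgresql://localhost:5432/app
```

The same endpoints work from Node.js, Go, Java, or any other stack with standard drivers.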
- 💸 Bootstrapped Startups - Build MVPs with zero infrastructure costs during early development
- 🔒 Privacy-First Enterprises - Run open-source AI models locally, keeping data in-house
- ⏰ Corporate Developers - Skip IT approval queues - get PostgreSQL and Redis running now
- 📱 Demo Heroes - Tunnel your app to any device - present from iPhone to client's iPad instantly
- 🤝 Remote Teams - Share running environments with frontend developers without deployment hassles
- 🎓 Students & Learners - Master databases and AI without complex setup or cloud accounts
- 🧪 Testing Pipelines - Integrate AI and databases in CI without external dependencies
- 🔧 Prototype Speed - Spin up full-stack environments faster than booting a VM
- 🤖 AI Assistant Users - Works seamlessly with Claude Code, Cursor, Gemini CLI for AI-powered development
Choose your platform for one-command installation:

# macOS/Linux
curl -fsSL https://localcloud.sh/install | bash

# Windows (PowerShell) - Install
iwr -useb https://localcloud.sh/install.ps1 | iex
# Windows (PowerShell) - Update/Reinstall
# (Invoke-Expression takes no arguments, so pass -Force via a script block)
& ([scriptblock]::Create((iwr -useb https://localcloud.sh/install.ps1).Content)) -Force
# Homebrew (macOS/Linux)
brew install localcloud-sh/tap/localcloud
# Coming Soon
# winget install localcloud # Windows Package Manager
# choco install localcloud # Chocolatey
# scoop install localcloud # Scoop
# apt install localcloud # Debian/Ubuntu
# dnf install localcloud # Fedora
# pacman -S localcloud # Arch Linux
📋 Alternative Installation Methods
Windows (PowerShell)
# Install (https://localcloud.sh/install.ps1)
iwr -useb https://localcloud.sh/install.ps1 | iex
# Update/Reinstall
# (Invoke-Expression takes no arguments, so pass -Force via a script block)
& ([scriptblock]::Create((iwr -useb https://localcloud.sh/install.ps1).Content)) -Force
Manual Download:
- Download latest release from GitHub Releases
- Extract the archive for your platform
- Move binary to PATH directory
Development Build:
git clone https://github.com/localcloud-sh/localcloud
cd localcloud
go build -o localcloud ./cmd/localcloud
# Setup your project with an interactive wizard
lc setup
You'll see an interactive wizard:
? What would you like to build? (Use arrow keys)
❯ Chat Assistant - Conversational AI with memory
RAG System - Document Q&A with vector search
Custom - Select components manually
? Select components you need: (Press <space> to select, <enter> to confirm)
❯ ◯ [AI] LLM (Text generation) - Large language models for text generation, chat, and completion
◯ [AI] Embeddings (Semantic search) - Text embeddings for semantic search and similarity
◯ [Database] Database (PostgreSQL) - Standard relational database for data storage
◯ [Database] Vector Search (pgvector) - Add vector similarity search to PostgreSQL
◯ [Database] NoSQL Database (MongoDB) - Document-oriented database for flexible data storage
◯ [Infrastructure] Cache (Redis) - In-memory cache for temporary data and sessions
◯ [Infrastructure] Queue (Redis) - Reliable job queue for background processing
◯ [Infrastructure] Object Storage (MinIO) - S3-compatible object storage for files and media
Then start your services:
lc start
# Your infrastructure is now running!
# Check status: lc status
AI assistants can set up projects with simple commands:
# Quick presets for common stacks
lc setup my-ai-app --preset=ai-dev --yes # AI + Database + Vector search
lc setup my-app --preset=full-stack --yes # Everything included
lc setup blog --preset=minimal --yes # Just AI models
# Or specify exact components
lc setup my-app --components=llm,database,storage --models=llama3.2:3b --yes
Note: `lc` is the short alias for `localcloud` - use whichever you prefer!
- 🚀 One-Command Setup: Create and configure projects with just `lc setup`
- 💰 Zero Cloud Costs: Everything runs locally - no API fees or usage limits
- 🔒 Complete Privacy: Your data never leaves your machine
- 📦 Pre-built Templates: Production-ready backends for common AI use cases
- 🧠 Optimized Models: Carefully selected models that run on 4GB RAM
- 🔧 Developer Friendly: Simple CLI, clear errors, extensible architecture
- 🐳 Docker-Based: Consistent environment across all platforms
- 🌐 Mobile Ready: Built-in tunnel support for demos anywhere
- 📤 Export Tools: One-command migration to any cloud provider
- 🤖 AI Assistant Ready: Non-interactive setup perfect for Claude Code, Cursor, Gemini CLI
Make production infrastructure as simple as running a local web server.
LocalCloud eliminates the complexity and cost of infrastructure setup by providing a complete, local-first development environment. No cloud bills, no data privacy concerns, no complex configurations - just pure development productivity.
For AI coding assistants: Share this repository link to give your AI assistant complete context:
"I'm using LocalCloud for local AI development. Please review this repository to understand its capabilities: https://github.com/localcloud-sh/localcloud"
Your AI assistant will automatically understand all commands and help you build applications using LocalCloud's non-interactive setup options.
Waiting 3 weeks for cloud access approval? Your POC could be done by then. LocalCloud lets you build and demonstrate AI solutions immediately, no IT tickets required.
Present from your phone to any client's screen. Built-in tunneling means you can demo your AI app from anywhere - coffee shop WiFi, client office, or conference room.
We've all been there - spun up a demo, showed the client, forgot to tear it down. With LocalCloud, closing your laptop is shutting down the infrastructure.
Healthcare, finance, government? Some data can't leave the building. LocalCloud keeps everything local while giving you cloud-level capabilities.
No API rate limits. No usage caps. No waiting for credits. Just pure development speed when every minute counts.
Your Own Cursor/Copilot: Build an AI code editor without $10k/month in API costs during development.
AI Mobile Apps: Develop and test your AI-powered iOS/Android app locally until you're ready to monetize.
SaaS MVP: Validate your AI startup idea without cloud bills - switch to cloud only after getting paying customers.
For Employers: Give candidates a pre-configured LocalCloud environment. No setup headaches, just coding skills evaluation.
For Candidates: Submit a fully-working AI application. While others struggle with API keys, you ship a complete solution.
AI Customer Support Trainer: Process your support tickets locally to train a custom assistant.
Code Review Bot: Build a team-specific code reviewer without sending code to external APIs.
Meeting Transcription System: Record, transcribe, and summarize meetings - all on company hardware.
"Hey Claude, build me a chatbot backend" → Your AI assistant runs lc setup my-chatbot --preset=ai-dev --yes
and in 60 seconds you have PostgreSQL, vector search, AI models, and Redis running locally. Complete with database schema, API endpoints, and a working chat interface. By the time you finish your coffee, you're making API calls to your fully functional backend.
No cloud signup. No credit card. No infrastructure drama. Just pure AI-assisted development velocity.
During `lc setup`, you can choose from pre-configured templates or customize your own service selection:
lc setup my-assistant # Select "Chat Assistant" template
- Conversational AI with persistent memory
- PostgreSQL for conversation storage
- Streaming responses
- Model switching support
lc setup my-knowledge-base # Select "RAG System" template
- Document ingestion and embedding
- Vector search with pgvector
- Context-aware responses
- Scalable to millions of documents
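Under the hood, the RAG template's vector search boils down to a pgvector similarity query. A sketch of what that query looks like, assuming illustrative table and column names (`documents`, `embedding`, `content`); `<=>` is pgvector's cosine-distance operator:

```python
# Sketch of the similarity query a RAG system issues against pgvector.
# Table and column names are illustrative, not fixed by LocalCloud.

def topk_query(k: int) -> str:
    """Render a parameterized top-k cosine-similarity query for pgvector."""
    return (
        "SELECT content, embedding <=> %(query_vec)s AS distance "
        "FROM documents "
        f"ORDER BY embedding <=> %(query_vec)s LIMIT {k}"
    )

# Executed with any standard PostgreSQL driver, e.g.:
#   cur.execute(topk_query(5), {"query_vec": query_embedding})
```

Ordering directly on the `<=>` expression (rather than the alias) lets pgvector use its index for the nearest-neighbor scan.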
lc setup my-transcriber # Select "Speech/Whisper" template
- Audio transcription API
- Multiple language support
- Real-time processing
- Optimized Whisper models
lc setup my-project # Choose "Custom" and select individual services
- Pick only the services you need
- Configure each service individually
- Optimal resource usage
Note: MVP includes backend infrastructure only. Frontend applications coming in v2.
LocalCloud Project Structure:
├── .localcloud/ # Project configuration
│ └── config.yaml # Service configurations and runtime settings
├── .gitignore # Git ignore file (excludes .localcloud from version control)
└── your-app/ # Your application code goes here
- Setup: `lc setup [project-name]` creates the project and opens the interactive wizard
  - Creates project structure (if new)
  - Choose a template or custom services
  - Select AI models
  - Configure ports and resources
- Start: `lc start` launches all services
- Develop: Services are ready for your application
| Service | Description | Default Port |
|---|---|---|
| AI/LLM | Ollama with selected models | 11434 |
| Database | PostgreSQL (optional pgvector extension) | 5432 |
| MongoDB | Document-oriented NoSQL database | 27017 |
| Cache | Redis for performance | 6379 |
| Queue | Redis for job processing | 6380 |
| Storage | MinIO (S3-compatible) | 9000/9001 |
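Since the AI service is stock Ollama on its standard port, you can talk to it with Ollama's regular HTTP API. A minimal sketch, assuming services are running on the default port from the table above and that you selected the `llama3.2:3b` model mentioned elsewhere in this README:

```python
# Sketch: calling the local AI service via Ollama's standard /api/generate endpoint.
import json
import urllib.request

def generate_request(prompt: str, model: str = "llama3.2:3b") -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama service."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# With services running (`lc start`):
#   with urllib.request.urlopen(generate_request("Say hello")) as resp:
#       print(json.loads(resp.read())["response"])
```

No API key is involved; the model runs entirely on your machine.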
- OS: macOS, Linux, Windows 10/11
- RAM: 4GB minimum (8GB recommended)
- Disk: 10GB free space
- Docker: Docker Desktop or Docker Engine
- CPU: x64 or ARM64 processor
Note: LocalCloud is written in Go for performance, but you don't need Go installed. The CLI is a single binary that works everywhere. Windows users can install via PowerShell - no WSL required.
Windows:
# Check if update is needed (will show current version)
iwr -useb https://localcloud.sh/install.ps1 | iex
# Force update/reinstall
# (Invoke-Expression takes no arguments, so pass -Force via a script block)
& ([scriptblock]::Create((iwr -useb https://localcloud.sh/install.ps1).Content)) -Force
macOS/Linux (Homebrew):
brew upgrade localcloud-sh/tap/localcloud
Linux (without Homebrew):
# Re-run install script
curl -fsSL https://localcloud.sh/install | bash
# Create and configure new project
lc setup [project-name]
# Configure existing project (in current directory)
lc setup
# Add/remove components
lc setup --add llm
lc setup --add vector # Add vector search to existing database
lc setup --remove cache
lc setup --remove vector # Remove vector search, keep database
# Start all services
lc start
# Stop all services
lc stop
# View service status
lc status
# View logs
lc logs [service]