Releases: rockbite/localforge
Localforge v1.0.24
- Compression and image uploads now work in the macOS and Windows builds
- Various small bugs fixed
- Support for MCP commands was added via stdio (e.g. for the IntelliJ MCP Server)
- Fixed BatchTool failing to pass data to some other tools
- Tool calling now shows its description text even after it has finished running
- Fixed various scenarios where AbortSignal was not stopping an operation (see the cancellation sketch after this list)
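The fixes above revolve around the standard AbortController/AbortSignal pattern. Below is a minimal sketch of that pattern in TypeScript; the function and handler names are illustrative assumptions, not Localforge's actual internals.

```typescript
// Minimal sketch of AbortSignal-based cancellation (illustrative names,
// not Localforge's actual code).
async function runOperation(url: string, signal: AbortSignal): Promise<string> {
  // Bail out early if the user already pressed Stop.
  if (signal.aborted) throw new Error("Operation aborted before it started");

  // fetch() natively supports AbortSignal: aborting rejects the promise
  // with an AbortError instead of leaving the request running.
  const res = await fetch(url, { signal });
  return res.text();
}

const controller = new AbortController();

// Simulate the user clicking a Stop button shortly after the request starts.
setTimeout(() => controller.abort(), 100);

runOperation("https://example.com/api", controller.signal)
  .then((body) => console.log("received", body.length, "bytes"))
  .catch((err) => {
    // AbortError is expected when the user cancels; anything else is a real failure.
    if ((err as Error).name === "AbortError") console.log("Operation stopped by user");
    else console.error(err);
  });
```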
Localforge v1.0.23
- Instead of the hardcoded port 3001, the app now runs on any available port: it tries 3826 first (chosen as an uncommon default) and falls back to other free ports if that one is busy (a sketch of the idea follows this list)
- Added a "Compress" button at the top of each chat that uses the Expert model to summarize long conversation history, saving tokens as the context grows
- Added support for MCP servers: you can now register multiple MCP servers and use them in chats to expose additional tools
- Additional fixes to the token counter; it now updates in real time even while the agentic loop is running
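The port behaviour described above can be sketched as "try the preferred port, otherwise let the OS pick one". The snippet below is an illustration of that idea under those assumptions, not Localforge's actual startup code.

```typescript
import http from "node:http";
import type { AddressInfo } from "node:net";

// Sketch of the port-selection behaviour: try 3826 first and, if it is
// already taken, fall back to an OS-assigned free port (port 0).
function listenOnAvailablePort(server: http.Server, preferred = 3826): Promise<number> {
  return new Promise((resolve, reject) => {
    server.once("error", (err: NodeJS.ErrnoException) => {
      if (err.code === "EADDRINUSE") {
        // Preferred port is busy: port 0 asks the OS for any free port.
        server.listen(0, () => resolve((server.address() as AddressInfo).port));
      } else {
        reject(err);
      }
    });
    server.listen(preferred, () => resolve(preferred));
  });
}

const server = http.createServer((_req, res) => res.end("ok"));
listenOnAvailablePort(server).then((port) => console.log(`UI listening on port ${port}`));
```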
Localforge v1.0.22
What’s Changed
Added
- view tool: Now skips massive binaries and can also "look into" images (a sketch of the idea follows this list).
- macOS packaging: Builds are now universal (x64 + arm64).
- Background typing in the text editor while the model is busy (sending is still disabled until the model is free).
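A rough sketch of how a file-viewing tool might combine those two behaviours: route images to a vision-capable model as base64, and skip files that are too large or look binary. The threshold, extensions, and return shape are assumptions for illustration, not Localforge's actual `view` implementation.

```typescript
import { readFileSync, statSync } from "node:fs";
import { extname } from "node:path";

const MAX_TEXT_BYTES = 1024 * 1024; // 1 MB cap (an assumed threshold)
const IMAGE_EXTS = new Set([".png", ".jpg", ".jpeg", ".gif", ".webp"]);

function viewFile(path: string): { kind: "text" | "image" | "skipped"; data?: string } {
  if (IMAGE_EXTS.has(extname(path).toLowerCase())) {
    // Images are base64-encoded so a multimodal model can "look into" them.
    return { kind: "image", data: readFileSync(path).toString("base64") };
  }
  if (statSync(path).size > MAX_TEXT_BYTES) {
    return { kind: "skipped" }; // massive file: don't flood the context window
  }
  const buf = readFileSync(path);
  // A NUL byte in the first few KB is a cheap heuristic for "this is binary".
  if (buf.subarray(0, 8192).includes(0)) return { kind: "skipped" };
  return { kind: "text", data: buf.toString("utf8") };
}

console.log(viewFile("README.md").kind);
```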
Fixed
- Message Prompt Editor → Block tab: Send button now functions correctly.
- Token counter: Accurate counts; removed erratic jumps and added new Claude and Gemini pricing.
- Socket uploads: Large PNG attachments no longer crash the socket connection.
Localforge v1.0.20
- Multiple bug fixes of the day
Localforge v1.0.19
Added
- Support for Qwen3 via the OpenAI base URL, with post-processing for `<think>` and `<tool_call>` tags (a sketch follows this list)
- Fixed bugs with the auto-updater
- Modal dialogs now close when ESC is pressed
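For illustration, the kind of post-processing mentioned above could look like the sketch below: strip `<think>` reasoning blocks and pull `<tool_call>` JSON payloads out of the raw completion. The regexes and function name are assumptions, not Localforge's exact code.

```typescript
// Illustrative post-processing for Qwen3-style output: drop <think>...</think>
// reasoning blocks and extract <tool_call> JSON payloads.
function postProcessQwen(raw: string): { text: string; toolCalls: unknown[] } {
  const toolCalls: unknown[] = [];

  // Collect every <tool_call>{...}</tool_call> block as parsed JSON.
  const withoutTools = raw.replace(
    /<tool_call>([\s\S]*?)<\/tool_call>/g,
    (_match: string, body: string) => {
      try {
        toolCalls.push(JSON.parse(body.trim()));
      } catch {
        // Malformed payload: keep it as plain text so nothing is silently lost.
        return body;
      }
      return "";
    },
  );

  // Strip the model's internal reasoning before showing the reply to the user.
  const text = withoutTools.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
  return { text, toolCalls };
}

console.log(postProcessQwen('<think>hmm</think>Hello <tool_call>{"name":"view"}</tool_call>'));
```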
Localforge v1.0.17
Added
- Finally added CI
- Fixes for macOS-related issues when running from the DMG
- Fixes for Windows-related issues
Localforge v1.0.14
Added
- Chat Context Visibility – system prompts now visible in message context.
- Theme System – complete CSS structure refactor with support for multiple themes.
- Light/Dark Mode – toggle between light and dark themes.
- New Themes – Caramel Latte & Dark Coffee themes added.
- Agent Flavours – create custom agent configurations with system prompt, model and tool overrides per chat (a sketch of such a config follows this list).
- Auto-Update Feature – automatic updates for npm installations.
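Based on the description above, an agent flavour might carry roughly the following shape; every field name here is a hypothetical assumption for illustration, not Localforge's actual schema.

```typescript
// Hypothetical shape of an "agent flavour": system prompt plus model and tool
// overrides per chat. Field names are assumptions, not the real schema.
interface AgentFlavour {
  name: string;            // e.g. "Code Reviewer"
  systemPrompt?: string;   // overrides the default system prompt
  modelOverrides?: {
    aux?: string;          // matching the aux/main/expert preset split
    main?: string;
    expert?: string;
  };
  enabledTools?: string[]; // whitelist of tool names available to this flavour
}

const reviewer: AgentFlavour = {
  name: "Code Reviewer",
  systemPrompt: "You review diffs for bugs and style issues.",
  modelOverrides: { main: "gpt-4o" },
  enabledTools: ["view", "BatchTool"],
};
console.log(reviewer.name);
```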
Fixed
- Resolved issue with simultaneous tool calls and content responses.
Changed
- CSS Structure – total refactor for improved maintainability and theme support.
Localforge v1.0.11
Added
- Prompt Editor – edit prompts as drag-n-drop blocks (foundation for a future system-prompt library).
- New LLM providers: Anthropic (Claude), Google Gemini, Google Vertex AI, and Ollama.
- Custom base-URL support for all providers, bringing the roster to: OpenAI, Azure OpenAI, DeepSeek, Groq, Anyscale, Fireworks, Together, Mistral, Perplexity, OpenRouter, Gemini, Vertex AI, Claude, Ollama (see the sketch after this list).
- Settings dialog 2.0:
  - Split “Web Fetch” options into its own tab.
  - The Models tab now shows provider types.
  - Full CRUD for providers (create, edit, delete).
  - Three independent model-preset slots (aux, main, expert) – each can point to any provider.
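To show why a custom base URL matters: one OpenAI-compatible client can target different providers just by swapping the URL. The sketch below uses the `openai` npm package (which does accept a `baseURL` option); the specific endpoints, keys, and model names are illustrative assumptions.

```typescript
import OpenAI from "openai";

// One OpenAI-compatible client pointed at different providers via baseURL.
// Endpoints and keys are illustrative, not a definitive provider list.
const providers = {
  openai:   { baseURL: "https://api.openai.com/v1",   apiKey: process.env.OPENAI_API_KEY },
  deepseek: { baseURL: "https://api.deepseek.com/v1", apiKey: process.env.DEEPSEEK_API_KEY },
  ollama:   { baseURL: "http://localhost:11434/v1",   apiKey: "ollama" }, // local, key is a placeholder
};

async function chat(provider: keyof typeof providers, model: string, prompt: string): Promise<string> {
  const { baseURL, apiKey } = providers[provider];
  const client = new OpenAI({ baseURL, apiKey: apiKey ?? "" });
  const res = await client.chat.completions.create({
    model,
    messages: [{ role: "user", content: prompt }],
  });
  return res.choices[0]?.message?.content ?? "";
}

chat("ollama", "qwen3", "Say hello").then(console.log);
```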
Changed
- LLM middleware refactor – nuked the spaghetti; now lean, readable, and provider-agnostic.
Fixed
- First user message is no longer accidentally duplicated in LLM conversations.
Removed
- Dead “shitcode” purged from middleware.
Localforge v1.0.10
- Fixed critical bugs, such as tool calling failing with an Access Denied error
- Changed model prompts to avoid confusion about user/agent roles and about replace/edit tool usage
- Temporarily removed non-working options from the settings dialog
- Made the message box resizable
- Fixed the system prompt to better follow user instructions
- Agent links now open in a new window
- Bash scripts can now be prevented before running and stopped mid-run (see the sketch below)
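A minimal sketch of "preventable and stoppable" shell execution: a confirmation hook lets the user veto a script, and an AbortSignal passed to `child_process.spawn` kills it mid-run. The function names and callbacks are assumptions for illustration, not Localforge's actual implementation.

```typescript
import { spawn } from "node:child_process";

// "Preventable": the confirm callback can veto the script before it runs.
// "Stoppable": aborting the signal kills the spawned child process.
async function runBash(
  script: string,
  confirm: (script: string) => Promise<boolean>,
  signal: AbortSignal,
): Promise<number> {
  if (!(await confirm(script))) return -1; // user blocked the script

  return new Promise((resolve, reject) => {
    // Passing `signal` makes Node kill the child when the signal aborts.
    const child = spawn("bash", ["-c", script], { signal });
    child.on("error", reject);              // includes the AbortError raised on cancel
    child.on("exit", (code) => resolve(code ?? -1));
  });
}

// Example: auto-approve, then stop the script after one second.
const controller = new AbortController();
setTimeout(() => controller.abort(), 1000);
runBash("sleep 30 && echo done", async () => true, controller.signal)
  .then((code) => console.log("exit code", code))
  .catch((err) => console.log("stopped:", (err as Error).message));
```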