LLM4S provides a simple, robust, and scalable framework for building LLM applications in Scala. While most LLM work is done in Python, we believe that Scala offers a fundamentally better foundation for building reliable, maintainable AI-powered applications.
Note: This is a work in progress project and is likely to change significantly over time.
- Type Safety: Catch errors at compile time rather than runtime
- Functional Programming: Immutable data structures and pure functions for more predictable code
- JVM Ecosystem: Access to a vast array of libraries and tools
- Concurrency: Better handling of concurrent operations through the JVM and Scala's ecosystem (futures, effect systems, and actor-based libraries)
- Performance: JVM performance with functional programming elegance
- Containerized Workspace: Secure execution environment for LLM-driven operations
- Workspace Agent Interface: Standardized protocol for file operations and command execution
- Multi-Provider Support: Planned support for multiple LLM providers (OpenAI, Anthropic, etc.)
- llm4s: Main project - contains the core LLM4S framework
- shared: Shared code between main project and workspace runner
- workspaceRunner: Code that performs the requested actions on the workspace within the docker container
- samples: Usage examples demonstrating various features
To get started with the LLM4S project, check out this teaser talk presented by Kannupriya Kalra at the Bay Area Scala Conference. This recording is essential for understanding where we’re headed:
🎥 Teaser Talk: https://www.youtube.com/watch?v=SXybj2P3_DE&ab_channel=SalarRahmanian
- JDK 21+
- SBT
- Docker (for containerized workspace)
# For the default Scala version (3.3.3)
sbt compile
# For all supported Scala versions (2.13 and 3.3)
sbt +compile
# Build and test all versions
sbt buildAll
You will need an API key for either OpenAI (https://platform.openai.com/) or Anthropic (https://console.anthropic.com/); other LLM providers may be supported in the future (see the backlog).
Set the environment variables:
LLM_MODEL=openai/gpt-4o
OPENAI_API_KEY=<your_openai_api_key>
or Anthropic:
LLM_MODEL=anthropic/claude-3-7-sonnet-latest
ANTHROPIC_API_KEY=<your_anthropic_api_key>
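The provider prefix in LLM_MODEL determines which API key is needed. A minimal sketch of reading this configuration from the environment (illustrative only; LLMConfig, providerOf, and apiKeyEnvVar are not the LLM4S API):

```scala
// Illustrative sketch, not the LLM4S API: deriving the provider and the
// expected API-key variable from the LLM_MODEL environment variable.
object LLMConfig {
  // "openai/gpt-4o" -> "openai"; the provider prefix selects the API key.
  def providerOf(model: String): String = model.split("/", 2)(0)

  def apiKeyEnvVar(provider: String): Option[String] = provider match {
    case "openai"    => Some("OPENAI_API_KEY")
    case "anthropic" => Some("ANTHROPIC_API_KEY")
    case _           => None
  }

  def main(args: Array[String]): Unit = {
    val model  = sys.env.getOrElse("LLM_MODEL", "openai/gpt-4o")
    val keyVar = apiKeyEnvVar(providerOf(model))
    println(s"model=$model, expected key variable=${keyVar.getOrElse("<unknown>")}")
  }
}
```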
This will allow you to run the non-containerized examples.
# Using Scala 3.3.3
sbt "samples/runMain org.llm4s.samples.basic.BasicLLMCallingExample"
sbt docker:publishLocal
sbt "samples/runMain org.llm4s.samples.workspace.ContainerisedWorkspaceDemo"
# Using Scala 2.13
sbt ++2.13.14 "samples/runMain org.llm4s.samples.basic.BasicLLMCallingExample"
LLM4S supports both Scala 2.13 and Scala 3.3. The codebase is set up to handle version-specific code through source directories:
- src/main/scala - common code for both Scala 2.13 and 3.3
- src/main/scala-2.13 - Scala 2.13 specific code
- src/main/scala-3 - Scala 3 specific code
When you need to use version-specific features, place the code in the appropriate directory.
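For example, an ADT that uses a Scala-3-only feature needs a twin in each version directory. The sealed-trait encoding below compiles on both versions (so it could also simply live in src/main/scala); the enum shown in the comment is the Scala-3-only alternative. The Provider type here is illustrative, not actual LLM4S code:

```scala
// Illustrative example of version-specific sources (not actual LLM4S code).
// This sealed-trait encoding works on both Scala 2.13 and 3:
sealed trait Provider
object Provider {
  case object OpenAI    extends Provider
  case object Anthropic extends Provider
  val all: List[Provider] = List(OpenAI, Anthropic)
}
// A Scala-3-only `enum Provider { case OpenAI, Anthropic }` would instead go
// in src/main/scala-3, with this encoding as its src/main/scala-2.13 twin.
```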
We've added convenient aliases for cross-compilation:
# Compile for all Scala versions
sbt compileAll
# Test all Scala versions
sbt testAll
# Both compile and test
sbt buildAll
# Publish for all versions
sbt publishAll
We use specialized test projects to verify cross-version compatibility against the published artifacts. These tests ensure that the library works correctly across different Scala versions by testing against actual published JARs rather than local target directories.
# Run tests for both Scala 2 and 3 against published JARs
sbt testCross
# Full clean, publish, and test verification
sbt fullCrossTest
Note: For detailed information about our cross-testing strategy and setup, see crosstest/README.md
Our goal is to implement Scala equivalents of popular Python LLM frameworks:
- Single API access to multiple LLM providers (like LiteLLM)
- A comprehensive toolchain for building LLM apps (like LangChain/LangGraph)
- An agentic framework (like PydanticAI, CrewAI)
- Tokenization utilities (port of tiktoken)
- Full ReAct loop implementation
- Simple tool calling mechanism
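A ReAct loop alternates model reasoning with tool execution, accumulating observations until the model decides to finish. A minimal sketch of that shape, with a stubbed model and tool (all names here are illustrative, not the LLM4S implementation):

```scala
// Sketch of a ReAct-style loop with a stubbed "model" and tool runner.
object ReActSketch {
  sealed trait Step
  final case class Act(tool: String, input: String) extends Step
  final case class Finish(answer: String)           extends Step

  // Stubbed model: decides the next step from the scratchpad of observations.
  def model(scratchpad: List[String]): Step =
    if (scratchpad.isEmpty) Act("search", "Scala LLM frameworks")
    else Finish(s"done after ${scratchpad.size} observation(s)")

  // Stubbed tool execution; in practice this would call a real tool.
  def runTool(tool: String, input: String): String =
    s"[$tool] results for '$input'"

  // The loop: act, observe, and re-prompt until the model finishes.
  @annotation.tailrec
  def loop(scratchpad: List[String]): String = model(scratchpad) match {
    case Finish(answer)   => answer
    case Act(tool, input) => loop(scratchpad :+ runTool(tool, input))
  }
}
```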
Tool calling is a critical integration - we aim to make it as simple as possible:
Using ScalaMeta to automatically generate tool definitions from Scala methods:
/** My tool does some funky things with a & b...
* @param a The first thing
* @param b The second thing
*/
def myTool(a: Int, b: String): ToolResponse = {
// Implementation
}
ScalaMeta extracts method parameters, types, and documentation to generate OpenAI-compatible tool definitions.
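For the myTool method above, the resulting OpenAI-style function definition would look roughly like this (hand-written JSON held in a Scala string for illustration; the actual generated form may differ):

```scala
// Hypothetical shape of the tool definition derived from myTool's signature
// and Scaladoc (illustrative; the real generated output may differ).
object ToolDefs {
  val myToolDefinition: String =
    """{
      |  "type": "function",
      |  "function": {
      |    "name": "myTool",
      |    "description": "My tool does some funky things with a & b...",
      |    "parameters": {
      |      "type": "object",
      |      "properties": {
      |        "a": { "type": "integer", "description": "The first thing" },
      |        "b": { "type": "string",  "description": "The second thing" }
      |      },
      |      "required": ["a", "b"]
      |    }
      |  }
      |}""".stripMargin
}
```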
Mapping LLM tool call requests to actual method invocations through:
- Code generation
- Reflection-based approaches
- ScalaMeta-based parameter mapping
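Whichever approach is used, the generated mapping reduces to something like the hand-written dispatcher below: match on the tool name, decode the arguments, and invoke the method (names and types here are assumptions for illustration, not the LLM4S API):

```scala
// Sketch (assumed names, not the LLM4S API): dispatching an LLM tool-call
// request to a Scala method by matching on the tool name.
object ToolDispatch {
  final case class ToolCall(name: String, args: Map[String, String])

  def myTool(a: Int, b: String): String = s"myTool($a, $b)"

  // Hand-written dispatcher; codegen, reflection, or ScalaMeta would
  // generate this name-to-method mapping automatically.
  def dispatch(call: ToolCall): Either[String, String] = call match {
    case ToolCall("myTool", args) =>
      (args.get("a").flatMap(_.toIntOption), args.get("b")) match {
        case (Some(a), Some(b)) => Right(myTool(a, b))
        case _                  => Left("myTool: missing or malformed arguments")
      }
    case ToolCall(other, _) => Left(s"unknown tool: $other")
  }
}
```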
Tools run in a protected Docker container environment to prevent accidental system damage or data leakage.
Stay updated with talks, workshops, and presentations about LLM4S delivered by maintainers and open-source contributors around the world. These sessions dive into the architecture, features, and future plans of the project.
| Date | Event/Conference | Talk Title | Location | Speaker | Details | Recording |
|---|---|---|---|---|---|---|
| 25-Feb-2025 | Bay Area Scala | Let's Teach LLMs to Write Great Scala! | Tubi office, San Francisco, CA | Kannupriya Kalra | Event Info | Watch Recording |
| 20-Apr-2025 | Scala India | Let's Teach LLMs to Write Great Scala! | India | Kannupriya Kalra | Event Info | Watch Recording |
📝 Want to invite us for a talk or workshop? Reach out via our respective emails or connect on Discord: https://discord.gg/4uvTPn6qww
- Build AI-powered applications in a statically typed, functional language designed for large systems.
- Help shape the Scala ecosystem’s future in the AI/LLM space.
- Learn modern LLM techniques like zero-shot prompting, tool calling, and agentic workflows.
- Collaborate with experienced Scala engineers and open-source contributors.
- Gain real-world experience working with Dockerized environments and multi-LLM providers.
- Contribute to a Google Summer of Code (GSoC)-eligible project.
- Join a global developer community focused on type-safe, maintainable AI systems.
Interested in contributing? Start here:
LLM4S GitHub Issues: https://lnkd.in/eXrhwgWY
Want to help build LLM4S and interact with other developers? Join our Discord community!
LLM4S Discord: https://lnkd.in/eb4ZFdtG
This project is also participating in Google Summer of Code (GSoC) 2025! If you're interested in joining as a GSoC contributor, check out the details here:
👉 Scala Center GSoC Ideas: https://lnkd.in/enXAepQ3
To know everything about GSoC and how it works, check out this talk:
🎥 GSoC Process Explained: https://lnkd.in/e_dM57bZ
Want to connect with maintainers? The LLM4S project is maintained by:
- Rory Graves - https://www.linkedin.com/in/roryjgraves/ | Email: rory.graves@fieldmark.co.uk | Discord: rorybot1
- Kannupriya Kalra - https://www.linkedin.com/in/kannupriyakalra/ | Email: kannupriyakalra@gmail.com | Discord: kannupriyakalra_46520
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.