LLM4S provides a simple, robust, and scalable framework for building LLM applications in Scala. While most LLM work is done in Python, we believe that Scala offers a fundamentally better foundation for building reliable, maintainable AI-powered applications.
Note: This is a work in progress project and is likely to change significantly over time.
- Type Safety: Catch errors at compile time rather than runtime
- Functional Programming: Immutable data structures and pure functions for more predictable code
- JVM Ecosystem: Access to a vast array of libraries and tools
- Concurrency: Better handling of concurrent operations with Scala's actor model
- Performance: JVM performance with functional programming elegance
- Containerized Workspace: Secure execution environment for LLM-driven operations
- Workspace Agent Interface: Standardized protocol for file operations and command execution
- Multi-Provider Support: Planned support for multiple LLM providers (OpenAI, Anthropic, etc.)
- Agent Trace Logging: Detailed markdown logs of agent execution for debugging and analysis
- llm4s: Main project - contains the core LLM4S framework
- shared: Shared code between main project and workspace runner
- workspaceRunner: Code that performs the requested actions on the workspace within the docker container
- samples: Usage examples demonstrating various features
To get started with the LLM4S project, check out this teaser talk presented by Kannupriya Kalra at the Bay Area Scala Conference. This recording is essential for understanding where we're headed:
🎥 Teaser Talk: https://www.youtube.com/watch?v=SXybj2P3_DE&ab_channel=SalarRahmanian
LLM4S was officially introduced at the Bay Area Scala Conference in San Francisco on February 25, 2025.
To ensure code quality, we use a Git pre-commit hook that automatically checks code formatting and runs tests before allowing commits:
# Install the pre-commit hook
./hooks/install.sh
# The hook will automatically:
# - Check code formatting with scalafmt
# - Compile code for both Scala 2.13 and 3
# - Run tests for both Scala versions
# To skip the hook temporarily (not recommended):
# git commit --no-verify
- JDK 21+
- SBT
- Docker (for containerized workspace)
# For the default Scala version
sbt compile
# For all supported Scala versions (2.13 and 3)
sbt +compile
# Build and test all versions
sbt buildAll
You will need an API key for either OpenAI (https://platform.openai.com/) or Anthropic (https://console.anthropic.com/); other LLMs may be supported in the future (see the backlog).
Set the environment variables for your chosen provider. For OpenAI:
LLM_MODEL=openai/gpt-4o
OPENAI_API_KEY=<your_openai_api_key>
or Anthropic:
LLM_MODEL=anthropic/claude-3-7-sonnet-latest
ANTHROPIC_API_KEY=<your_anthropic_api_key>
or OpenRouter:
LLM_MODEL=openai/gpt-4o
OPENAI_API_KEY=<your_openai_api_key>
OPENAI_BASE_URL=https://openrouter.ai/api/v1
This will allow you to run the non-containerized examples.
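As a quick illustration of how this configuration is consumed, the sketch below reads the same environment variables in plain Scala and picks the matching API key. It is not the LLM4S client API itself, just the provider-selection logic implied by the settings above.

```scala
// Sketch only: provider selection from the environment variables documented above.
// The actual LLM4S client construction is intentionally left out here.
object ProviderConfigCheck {
  def main(args: Array[String]): Unit = {
    val model = sys.env.getOrElse("LLM_MODEL", "openai/gpt-4o")
    val apiKey = model.takeWhile(_ != '/') match {
      case "openai"    => sys.env.get("OPENAI_API_KEY")
      case "anthropic" => sys.env.get("ANTHROPIC_API_KEY")
      case _           => None
    }
    val baseUrl = sys.env.get("OPENAI_BASE_URL") // only set when routing via OpenRouter

    println(s"model=$model, apiKeyPresent=${apiKey.isDefined}, baseUrl=${baseUrl.getOrElse("<provider default>")}")
  }
}
```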
# Using Scala 3
sbt "samples/runMain org.llm4s.samples.basic.BasicLLMCallingExample"
# For the containerised workspace demo, first publish the Docker image locally
sbt docker:publishLocal
sbt "samples/runMain org.llm4s.samples.workspace.ContainerisedWorkspaceDemo"
# Using Scala 2.13
sbt ++2.13.14 "samples/runMain org.llm4s.samples.basic.BasicLLMCallingExample"
LLM4S supports both Scala 2.13 and Scala 3.3. The codebase is set up to handle version-specific code through source directories:
- `src/main/scala` - Common code for both Scala 2.13 and 3.3
- `src/main/scala-2.13` - Scala 2.13 specific code
- `src/main/scala-3` - Scala 3 specific code
When you need to use version-specific features, place the code in the appropriate directory.
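As a hedged illustration of this layout (the package, object, and file names below are made up for the example and are not part of LLM4S), a small syntax shim might be split across the version-specific directories like this:

```scala
// src/main/scala-2.13/org/example/compat/StringSyntax.scala (illustrative only)
package org.example.compat

object StringSyntax {
  // Scala 2.13: extension methods are written as an implicit class
  implicit class StringOps(private val s: String) extends AnyVal {
    def quoted: String = "\"" + s + "\""
  }
}

// The Scala 3 twin at src/main/scala-3/org/example/compat/StringSyntax.scala
// would expose the same method using `extension` syntax:
//
//   object StringSyntax:
//     extension (s: String) def quoted: String = "\"" + s + "\""
//
// Shared code under src/main/scala imports StringSyntax and compiles
// unchanged against both Scala versions.
```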
We've added convenient aliases for cross-compilation:
# Compile for all Scala versions
sbt compileAll
# Test all Scala versions
sbt testAll
# Both compile and test
sbt buildAll
# Publish for all versions
sbt publishAll
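For reference, aliases like these are typically declared with addCommandAlias in build.sbt. The snippet below is a sketch of that pattern only; the exact commands each alias runs in the LLM4S build may differ.

```scala
// build.sbt (sketch): declaring cross-version command aliases.
// The concrete command lists in the real build may differ from these.
addCommandAlias("compileAll", ";+compile")
addCommandAlias("testAll", ";+test")
addCommandAlias("buildAll", ";+compile;+test")
addCommandAlias("publishAll", ";+publishLocal")
```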
We use specialized test projects to verify cross-version compatibility against the published artifacts. These tests ensure that the library works correctly across different Scala versions by testing against actual published JARs rather than local target directories.
# Run tests for both Scala 2 and 3 against published JARs
sbt testCross
# Full clean, publish, and test verification
sbt fullCrossTest
Note: For detailed information about our cross-testing strategy and setup, see crossTest/README.md
Our goal is to implement Scala equivalents of popular Python LLM frameworks:
- Single API access to multiple LLM providers (like LiteLLM) - llmconnect
- A comprehensive toolchain for building LLM apps (like LangChain/LangGraph)
  - RAG search
  - Tool calling
  - Logging/tracking/monitoring
- An agentic framework (like PydanticAI, CrewAI)
  - Single agent
  - Multi-agent
- Tokenization utilities (port of tiktoken)
- Examples / support
  - Standard tool calling libraries
  - Examples of all use cases
  - A stable platform - tests, etc.
- Scala Coding SWE Agent - an agent that can do SWE-bench-style tasks on Scala codebases
  - Code maps
  - Generation
  - Templates for library use?
Tool calling is a critical integration - we aim to make it as simple as possible:
Using ScalaMeta to automatically generate tool definitions from Scala methods:
/** My tool does some funky things with a & b...
* @param a The first thing
* @param b The second thing
*/
def myTool(a: Int, b: String): ToolResponse = {
// Implementation
}
ScalaMeta extracts method parameters, types, and documentation to generate OpenAI-compatible tool definitions.
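For context, an OpenAI-compatible tool definition is a small JSON schema built from the method name, parameter types, and Scaladoc. The string below sketches roughly what a generated definition for myTool could look like; the generator's exact output format is not pinned down here.

```scala
// Rough shape of an OpenAI-style tool definition that could be generated for
// `myTool` above (illustrative only - every field is derived from the method
// signature and its Scaladoc).
val myToolDefinition: String =
  """{
    |  "type": "function",
    |  "function": {
    |    "name": "myTool",
    |    "description": "My tool does some funky things with a & b...",
    |    "parameters": {
    |      "type": "object",
    |      "properties": {
    |        "a": { "type": "integer", "description": "The first thing" },
    |        "b": { "type": "string", "description": "The second thing" }
    |      },
    |      "required": ["a", "b"]
    |    }
    |  }
    |}""".stripMargin
```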
Mapping LLM tool call requests to actual method invocations (see the sketch after this list) through:
- Code generation
- Reflection-based approaches
- ScalaMeta-based parameter mapping
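To make the mapping concrete, here is a deliberately hand-rolled sketch of the glue that code generation, reflection, or ScalaMeta would produce automatically. The ToolCall type, the string-based argument map, and the simplified myTool signature are all hypothetical, not LLM4S types.

```scala
// Hypothetical dispatch glue: none of these types come from LLM4S.
final case class ToolCall(name: String, args: Map[String, String])

// Simplified stand-in for the tool method (returns a String rather than ToolResponse).
def myTool(a: Int, b: String): String = s"processed a=$a, b=$b"

def dispatch(call: ToolCall): Either[String, String] = call.name match {
  case "myTool" =>
    for {
      a <- call.args.get("a").flatMap(_.toIntOption).toRight("missing or invalid 'a'")
      b <- call.args.get("b").toRight("missing 'b'")
    } yield myTool(a, b)
  case other => Left(s"unknown tool: $other")
}

// dispatch(ToolCall("myTool", Map("a" -> "1", "b" -> "hi"))) == Right("processed a=1, b=hi")
```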
Tools run in a protected Docker container environment to prevent accidental system damage or data leakage.
LLM4S provides a powerful, configurable tracing system for monitoring, debugging, and analyzing your LLM applications with support for multiple backends.
Configure tracing behavior using the TRACING_MODE environment variable:
# Send traces to Langfuse (default)
TRACING_MODE=langfuse
LANGFUSE_PUBLIC_KEY=pk-lf-your-key
LANGFUSE_SECRET_KEY=sk-lf-your-secret
# Print detailed traces to console with colors and token usage
TRACING_MODE=print
# Disable tracing completely
TRACING_MODE=none
import org.llm4s.trace.Tracing
// Create tracer based on TRACING_MODE environment variable
val tracer = Tracing.create()
// Trace events, completions, and token usage
// (completion, tokenUsage, and agentState below come from your LLM calls and agent runs)
tracer.traceEvent("Starting LLM operation")
tracer.traceCompletion(completion, model)
tracer.traceTokenUsage(tokenUsage, model, "chat-completion")
tracer.traceAgentState(agentState)
See talks given by maintainers and open-source developers around the world, and the engagement they are generating in the developer community.
Stay updated with talks, workshops, and presentations about LLM4S happening globally. These sessions dive into the architecture, features, and future plans of the project.
Snapshots from LLM4S talks held around the world 🌍.
| Date | Event/Conference | Talk Title | Location | Speaker Name | Details URL | Recording Link URL |
|---|---|---|---|---|---|---|
| 25-Feb-2025 | Bay Area Scala | Let's Teach LLMs to Write Great Scala! (Original version) | Tubi office, San Francisco, CA, USA | Kannupriya Kalra | Event Info, Reddit Discussion, Mastodon Post, Bluesky Post, X/Twitter Post | Watch Recording |
| 20-Apr-2025 | Scala India | Let's Teach LLMs to Write Great Scala! (Updated from Feb 2025) | India | Kannupriya Kalra | Event Info, Reddit Discussion, X/Twitter Post | Watch Recording |
| 28-May-2025 | Functional World 2025 by Scalac | Let's Teach LLMs to Write Great Scala! (Updated from Apr 2025) | Gdansk, Poland | Kannupriya Kalra | LinkedIn Post 1, LinkedIn Post 2, Reddit Discussion, Meetup Link, X/Twitter Post, Scalendar Newsletter | Watch Recording |
| 13-Jun-2025 | Dallas Scala Enthusiasts | Let's Teach LLMs to Write Great Scala! (Updated from May 2025) | Dallas, Texas, USA | Kannupriya Kalra | Meetup Event, Scalendar June Newsletter, LinkedIn Post, X/Twitter Post, Reddit Discussion, Bluesky Post, Mastodon Post | Recording will be posted once the event is done |
| 21-Aug-2025 | Scala Days 2025 | Scala Meets GenAI: Build the Cool Stuff with LLM4S | SwissTech Convention Center, EPFL (École Polytechnique Fédérale de Lausanne) campus, Lausanne, Switzerland | Kannupriya Kalra, Rory Graves | Talk Info | Recording will be posted once the event is done |
👉 Want to invite us for a talk or workshop? Reach out via our respective emails or connect on Discord: https://discord.gg/4uvTPn6qww
- Build AI-powered applications in a statically typed, functional language designed for large systems.
- Help shape the Scala ecosystem's future in the AI/LLM space.
- Learn modern LLM techniques like zero-shot prompting, tool calling, and agentic workflows.
- Collaborate with experienced Scala engineers and open-source contributors.
- Gain real-world experience working with Dockerized environments and multi-LLM providers.
- Contribute to a project that offers you the opportunity to become a mentor or contributor funded by Google through its Google Summer of Code (GSoC) program.
- Join a global developer community focused on type-safe, maintainable AI systems.
Interested in contributing? Start here:
LLM4S GitHub Issues: https://lnkd.in/eXrhwgWY
Want to be part of developing this and interact with other developers? Join our Discord community!
LLM4S Discord: https://lnkd.in/eb4ZFdtG
LLM4S was selected for GSoC 2025 under the Scala Center Organisation.
This project is also participating in Google Summer of Code (GSoC) 2025! If you're interested in joining as a GSoC contributor, check out the details here:
👉 Scala Center GSoC Ideas: https://lnkd.in/enXAepQ3
To know everything about GSoC and how it works, check out this talk:
🎥 GSoC Process Explained: https://lnkd.in/e_dM57bZ
To learn about the experience of GSoC contributors of LLM4S, check out their blogs in the section below.
👉 Explore Past GSoC Projects with Scala Center: https://www.gsocorganizations.dev/organization/scala-center/
This page includes detailed information on all GSoC projects with Scala Center from past years, including project descriptions, code repositories, contributor blogs, and mentor details.
Hello GSoCers and future GSoC aspirants! Here are some essential onboarding links to help you collaborate and stay organized within the LLM4S community.
- 🔗 LLM4S GSoC GitHub Team: You have been invited to join the LLM4S GitHub team for GSoC participants. Accepting this invite will grant you access to internal resources and coordination tools.
  👉 https://github.com/orgs/llm4s/teams/gsoc/members
- 🔗 Private GSoC Project Tracking Board: Once you're part of the team, you will have access to our private GSoC tracking board. This board helps you track tasks, timelines, and project deliverables throughout the GSoC period.
  👉 https://github.com/orgs/llm4s/projects/3
- Contributor: Elvan Konukseven
- LinkedIn: https://www.linkedin.com/in/elvan-konukseven/ | Email: elvankonukseven0@gmail.com | Discord: elvan_31441
- Mentors: Kannupriya Kalra (Email: kannupriyakalra@gmail.com), Rory Graves (Email: rory.graves@fieldmark.co.uk)
- Announcement: Official Acceptance Post
- Contributor Blogs: 📝 elvankonukseven.com/blog
- Work log: 📋 GitHub Project Board
- Contributor: Gopi Trinadh Maddikunta
- LinkedIn: https://www.linkedin.com/in/gopitrinadhmaddikunta/ | Email: trinadh7341@gmail.com | Discord: g3nadh_58439
- Mentors: Kannupriya Kalra (Email: kannupriyakalra@gmail.com), Rory Graves (Email: rory.graves@fieldmark.co.uk), Dmitry Mamonov (Email: dmitry.s.mamonov@gmail.com)
- Announcement: Official Acceptance Post
- Contributor Blogs: 📝 Main Blog | 📝 Scala at Light Speed - Part 1 | 📝 Scala at Light Speed - Part 2
- Work log: 📋 Work Log - GitHub Project
- Contributor: Anshuman Awasthi
- LinkedIn: https://www.linkedin.com/in/let-me-try-to-fork-your-responsibilities/ | Email: mcs23026@iiitl.ac.in | Discord: anshuman23026
- Mentors: Kannupriya Kalra (Email: kannupriyakalra@gmail.com), Rory Graves (Email: rory.graves@fieldmark.co.uk)
- Announcement: Official Acceptance Post
- Contributor Blogs: 📝 Anshuman's GSoC Journey
- Work Log: 📋 GitHub Project Board
- Contributor: Shubham Vishwakarma
- LinkedIn: https://www.linkedin.com/in/shubham-vish/ | Email: smsharma3121@gmail.com | Discord: oxygen4076
- Mentors: Kannupriya Kalra (Email: kannupriyakalra@gmail.com), Rory Graves (Email: rory.graves@fieldmark.co.uk), Dmitry Mamonov (Email: dmitry.s.mamonov@gmail.com)
- Announcement: Official Acceptance Post
- Contributor Blogs: 📝 Cracking the Code: My GSoC 2025 Story
- Work log: 📋 GitHub Project Board
Feel free to reach out to the contributors or mentors listed for any guidance or questions related to GSoC 2026.
Want to connect with maintainers? The LLM4S project is maintained by:
- Rory Graves - https://www.linkedin.com/in/roryjgraves/ | Email: rory.graves@fieldmark.co.uk | Discord: rorybot1
- Kannupriya Kalra - https://www.linkedin.com/in/kannupriyakalra/ | Email: kannupriyakalra@gmail.com | Discord: kannupriyakalra_46520
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.