LLM4S - Large Language Models for Scala

Overview

LLM4S provides a simple, robust, and scalable framework for building LLM applications in Scala. While most LLM work is done in Python, we believe that Scala offers a fundamentally better foundation for building reliable, maintainable AI-powered applications.

Note: This is a work-in-progress project and is likely to change significantly over time.

Why Scala for LLMs?

  • Type Safety: Catch errors at compile time rather than runtime
  • Functional Programming: Immutable data structures and pure functions for more predictable code
  • JVM Ecosystem: Access to a vast array of libraries and tools
  • Concurrency: Better handling of concurrent operations with Scala's actor model
  • Performance: JVM performance with functional programming elegance

Features

  • Containerized Workspace: Secure execution environment for LLM-driven operations
  • Workspace Agent Interface: Standardized protocol for file operations and command execution (a sketch of the protocol shape follows this list)
  • Multi-Provider Support: Planned support for multiple LLM providers (OpenAI, Anthropic, etc.)
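
As a rough illustration of what the workspace agent interface is aiming for, here is a minimal sketch of such a protocol. The names (WorkspaceAgent, ExecResult) and signatures are hypothetical and are not taken from the actual LLM4S codebase.

// Hypothetical sketch of a workspace agent protocol. All names here
// (WorkspaceAgent, ExecResult) are illustrative, not the real LLM4S API.
final case class ExecResult(exitCode: Int, stdout: String, stderr: String)

trait WorkspaceAgent {
  // Read a file from the containerized workspace.
  def readFile(path: String): Either[String, String]

  // Write (or overwrite) a file in the workspace.
  def writeFile(path: String, content: String): Either[String, Unit]

  // Run a command inside the container and capture its output.
  def execCommand(command: Seq[String]): Either[String, ExecResult]
}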

Project Structure

  • llm4s: Main project - contains the core LLM4S framework
  • shared: Shared code between main project and workspace runner
  • workspaceRunner: Code that performs the requested actions on the workspace within the Docker container
  • samples: Usage examples demonstrating various features

Getting Started

To get started with the LLM4S project, check out this teaser talk presented by Kannupriya Kalra at the Bay Area Scala Conference. This recording is essential for understanding where we’re headed:

🎥 Teaser Talk: https://www.youtube.com/watch?v=SXybj2P3_DE&ab_channel=SalarRahmanian

Prerequisites

  • JDK 21+
  • SBT
  • Docker (for containerized workspace)

Building the Project

sbt compile

Running the Examples

sbt "samples/runMain org.llm4s.samples.basic.BasicLLMCallingExample"
sbt "samples/runMain org.llm4s.samples.workspace.ContainerisedWorkspaceDemo"

Roadmap

Our goal is to implement Scala equivalents of popular Python LLM frameworks:

  • Single API access to multiple LLM providers, like LiteLLM (a sketch of such an interface follows this list)
  • A comprehensive toolchain for building LLM apps (like LangChain/LangGraph)
  • An agentic framework (like PydanticAI, CrewAI)
  • Tokenization utilities (port of tiktoken)
  • Full ReAct loop implementation
  • Simple tool calling mechanism
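
As a rough sketch of what "single API access to multiple providers" could look like, here is an illustrative provider-agnostic client. All names (LLMClient, Message, Completion, the stub implementations) are hypothetical and do not reflect the actual LLM4S API.

// Hypothetical sketch of a provider-agnostic client; names are illustrative only.
sealed trait Provider
case object OpenAI    extends Provider
case object Anthropic extends Provider

final case class Message(role: String, content: String)
final case class Completion(content: String)

trait LLMClient {
  def complete(messages: Seq[Message]): Either[String, Completion]
}

object LLMClient {
  // One entry point selects the provider; callers only see the shared interface.
  def apply(provider: Provider, apiKey: String): LLMClient = provider match {
    case OpenAI    => new OpenAIClient(apiKey)
    case Anthropic => new AnthropicClient(apiKey)
  }

  // Stubs standing in for real HTTP-backed implementations.
  private final class OpenAIClient(apiKey: String) extends LLMClient {
    def complete(messages: Seq[Message]): Either[String, Completion] =
      Right(Completion("<response from OpenAI>"))
  }
  private final class AnthropicClient(apiKey: String) extends LLMClient {
    def complete(messages: Seq[Message]): Either[String, Completion] =
      Right(Completion("<response from Anthropic>"))
  }
}

A caller would then write something like LLMClient(OpenAI, apiKey).complete(Seq(Message("user", "Hello"))) and keep the same code regardless of which backend is configured.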

Tool Calling

Tool calling is a critical integration - we aim to make it as simple as possible:

Tool Signature Generation

Using ScalaMeta to automatically generate tool definitions from Scala methods:

/** My tool does some funky things with a & b...
 * @param a The first thing
 * @param b The second thing
 */
def myTool(a: Int, b: String): ToolResponse = {
  // Implementation
}

ScalaMeta extracts method parameters, types, and documentation to generate OpenAI-compatible tool definitions.
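For example, a minimal sketch of that extraction step might look like the following, assuming ScalaMeta 4.x is on the classpath; the printed output is illustrative, not the exact OpenAI JSON schema.

import scala.meta._

// Minimal sketch: parse a method definition and pull out the pieces needed
// for a tool schema (method name, parameter names, parameter types).
object ToolSignatureSketch extends App {
  val snippet = "def myTool(a: Int, b: String): ToolResponse = ???"

  snippet.parse[Stat].get match {
    case method: Defn.Def =>
      val params = method.paramss.flatten.map { p =>
        s"${p.name.value}: ${p.decltpe.map(_.syntax).getOrElse("Any")}"
      }
      println(s"tool name : ${method.name.value}")
      println(s"parameters: ${params.mkString(", ")}")
    case other =>
      println(s"Not a method definition: ${other.syntax}")
  }
}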

Tool Call Mapping

Mapping LLM tool call requests to actual method invocations through one of the following (a sketch of the resulting dispatch shape follows the list):

  • Code generation
  • Reflection-based approaches
  • ScalaMeta-based parameter mapping
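
Whichever mechanism produces it, the end result is roughly a registry from tool names to invocable functions. Here is a hand-written sketch of that shape; ToolCall, ToolRegistry, and the string-based argument encoding are all illustrative, and a real implementation would take JSON arguments from the provider.

// Hypothetical hand-written registry; generated or reflective variants
// would produce the same dispatch shape automatically.
final case class ToolCall(name: String, arguments: Map[String, String])

object ToolRegistry {
  // Each entry maps a tool name to a handler over its raw arguments.
  private val tools: Map[String, Map[String, String] => String] = Map(
    "myTool" -> { args =>
      val a = args("a").toInt // parameter names mirror myTool above
      val b = args("b")
      s"myTool(a=$a, b=$b) executed"
    }
  )

  def dispatch(call: ToolCall): Either[String, String] =
    tools.get(call.name) match {
      case Some(handler) => Right(handler(call.arguments))
      case None          => Left(s"Unknown tool: ${call.name}")
    }
}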

Secure Execution

Tools run in a protected Docker container environment to prevent accidental system damage or data leakage.

Contributing

Interested in contributing? Start here:

LLM4S GitHub Issues: https://lnkd.in/eXrhwgWY

Join the Community

Want to be part of developing this and interact with other developers? Join our Discord community!

LLM4S Discord: https://lnkd.in/eb4ZFdtG

Google Summer of Code 2025

This project is also participating in Google Summer of Code (GSoC) 2025! If you're interested in contributing to the project through GSoC, check out the details here:

👉 Scala Center GSoC Ideas: https://lnkd.in/enXAepQ3

To know everything about GSoC and how it works, check out this talk:

🎥 GSoC Process Explained: https://lnkd.in/e_dM57bZ

Maintainers

Want to connect with maintainers? The LLM4S project is maintained by:

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
