GitHub - jake-landersweb/gollm: Convenient interface for interacting with multiple LLM providers in Go
GoLanguageModel

Simple abstractions over LLM providers in Go to allow for complex LLM apps.

Currently supported LLMs:

  • OpenAI GPT3.5
  • OpenAI GPT4
  • Google Gemini
  • Anthropic Claude 2.1
  • Anthropic Claude Instant 1.2

LanguageModel Abstraction

Note: this documentation is slightly out of date. The functionality is the same, but the code was re-architected to support more configuration from the user.

The LLM abstraction allows multiple LLMs to be used at different points in the same conversation. The LanguageModel object hosts the conversation state in an LLM-agnostic conversation object; when a completion is requested from a specific LLM, the internal conversation state gets transformed into that LLM's format. On response, the message from the LLM provider is parsed and stored back in the agnostic state inside the LanguageModel object.

This simple abstraction lets you mix and match different LLMs at any point of the conversation. For example, as seen in llm_test.go:

model := NewLanguageModel(TEST_USER_ID, logger, "You are a pirate on a deserted island")

var err error
input1 := &CompletionInput{
    Model:       GEMINI_MODEL,
    Temperature: 0.7,
    Json:        false,
    Input:       "Where is the treasure matey?",
}
_, err = model.TokenEstimate(input1)
assert.Nil(t, err)

// run a gemini completion
_, err = model.GeminiCompletion(ctx, input1)
assert.Nil(t, err)
if err != nil {
    return
}

input2 := &CompletionInput{
    Model:       GPT3_MODEL,
    Temperature: 1.3,
    Json:        false,
    Input:       "Are you sure? You must show me now or suffer!",
}
_, err = model.TokenEstimate(input2)
assert.Nil(t, err)

// run a gpt completion
_, err = model.GPTCompletion(ctx, input2)
assert.Nil(t, err)
if err != nil {
    return
}

input3 := &CompletionInput{
    Model:       ANTHROPIC_CLAUDE2,
    Temperature: 0.7,
    Json:        false,
    Input:       "Aha! That's more like it! Treasure for everyone!",
}
_, err = model.TokenEstimate(input3)
assert.Nil(t, err)

// run an anthropic completion
_, err = model.AnthropicCompletion(ctx, input3)
assert.Nil(t, err)
if err != nil {
    return
}

model.PrintConversation()

assert.Equal(t, 7, len(model.conversation))

In this example, first the conversation is started with Gemini. Then, the conversation is extended with GPT 3.5. Lastly, the conversation is finished with Claude 2.1.

Resources

  • OpenAI
  • Gemini
  • Anthropic
  • Other