Feature/llm multi prompt #1070

Open · wants to merge 4 commits into base: main
63 changes: 52 additions & 11 deletions chains/llm.go
@@ -2,6 +2,7 @@ package chains

import (
"context"
"fmt"

"github.com/tmc/langchaingo/callbacks"
"github.com/tmc/langchaingo/llms"
@@ -11,14 +12,19 @@ import (
"github.com/tmc/langchaingo/schema"
)

const _llmChainDefaultOutputKey = "text"
const (
_llmChainDefaultOutputKey = "text"
_llmChainMultiPromptOutputKey = "choices"
Collaborator:

Is it necessary to change the chain's default output key when using multi-prompt?

)

type LLMChain struct {
Prompt prompts.FormatPrompter
LLM llms.Model
Memory schema.Memory
CallbacksHandler callbacks.Handler
OutputParser schema.OutputParser[any]
// When enabled, useMultiPrompt will not 'flatten' the prompt into a single message.
useMultiPrompt bool

OutputKey string
}
@@ -41,11 +47,17 @@ func NewLLMChain(llm llms.Model, prompt prompts.FormatPrompter, opts ...ChainCal
Memory: memory.NewSimple(),
OutputKey: _llmChainDefaultOutputKey,
CallbacksHandler: opt.CallbackHandler,
useMultiPrompt: false,
}

return chain
}

func (c *LLMChain) EnableMultiPrompt() {
Collaborator:

Maybe this should be adjustable via a ChainCallOption?

c.useMultiPrompt = true
c.OutputKey = _llmChainMultiPromptOutputKey
}
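The reviewer's ChainCallOption suggestion could be sketched with a functional option instead of a mutating method. This is a simplified stand-in, not the langchaingo API: the `llmChain`, `chainOption`, and `withMultiPrompt` names below are hypothetical and exist only to illustrate the pattern.

```go
package main

import "fmt"

// llmChain is a minimal stand-in for the real LLMChain struct.
type llmChain struct {
	useMultiPrompt bool
	outputKey      string
}

// chainOption is a functional option, analogous to a ChainCallOption.
type chainOption func(*llmChain)

// withMultiPrompt enables multi-prompt mode and switches the output key,
// mirroring what EnableMultiPrompt does in this diff.
func withMultiPrompt() chainOption {
	return func(c *llmChain) {
		c.useMultiPrompt = true
		c.outputKey = "choices"
	}
}

// newLLMChain applies options at construction time instead of mutating the
// chain afterwards.
func newLLMChain(opts ...chainOption) *llmChain {
	c := &llmChain{outputKey: "text"}
	for _, opt := range opts {
		opt(c)
	}
	return c
}

func main() {
	c := newLLMChain(withMultiPrompt())
	fmt.Println(c.useMultiPrompt, c.outputKey) // → true choices
}
```

The advantage over a setter is that the chain's configuration is fixed at construction, so callers cannot toggle multi-prompt mode mid-use and observe a half-switched output key.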

// Call formats the prompts with the input values, generates using the llm, and parses
// the output from the llm with the output parser. This function should not be called
// directly; rather, use the Call or Run function if the prompt only requires one input
@@ -56,17 +68,27 @@ func (c LLMChain) Call(ctx context.Context, values map[string]any, options ...Ch
return nil, err
}

result, err := llms.GenerateFromSinglePrompt(ctx, c.LLM, promptValue.String(), getLLMCallOptions(options...)...)
if err != nil {
return nil, err
}

finalOutput, err := c.OutputParser.ParseWithPrompt(result, promptValue)
if err != nil {
return nil, err
llmsOptions := getLLMCallOptions(options...)
var llmOutput any
if c.useMultiPrompt {
llmsResponse, err := c.LLM.GenerateContent(ctx, chatMessagesToLLmMessageContent(promptValue.Messages()), llmsOptions...)
if err != nil {
return nil, fmt.Errorf("llm generate content: %w", err)
}

Collaborator:

It should be documented somewhere that output parsers do not work with multi-prompt.

llmOutput = llmsResponse.Choices
} else {
resp, err := llms.GenerateFromSinglePrompt(ctx, c.LLM, promptValue.String(), llmsOptions...)
if err != nil {
return nil, err
}

llmOutput, err = c.OutputParser.ParseWithPrompt(resp, promptValue)
if err != nil {
return nil, err
}
}

return map[string]any{c.OutputKey: finalOutput}, nil
return map[string]any{c.OutputKey: llmOutput}, nil
}

// GetMemory returns the memory.
@@ -87,3 +109,22 @@ func (c LLMChain) GetInputKeys() []string {
func (c LLMChain) GetOutputKeys() []string {
return []string{c.OutputKey}
}

// chatMessagesToLLmMessageContent converts each ChatMessage directly to a
// MessageContent with the same content and type.
func chatMessagesToLLmMessageContent(chatMessages []llms.ChatMessage) []llms.MessageContent {
msgs := make([]llms.MessageContent, len(chatMessages))
for idx, m := range chatMessages {
msgs[idx] = chatMessageToLLm(m)
}
return msgs
}

func chatMessageToLLm(in llms.ChatMessage) llms.MessageContent {
return llms.MessageContent{
Parts: []llms.ContentPart{
llms.TextContent{Text: in.GetContent()},
},
Role: in.GetType(),
}
}
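The behavior of the two conversion helpers above can be illustrated with simplified stand-in types. The `chatMessage` and `messageContent` structs below are hypothetical local analogues of langchaingo's `llms.ChatMessage` and `llms.MessageContent`, used only to show the shape of the conversion: one text part per message, with the message type carried over as the role.

```go
package main

import "fmt"

// chatMessage is a simplified analogue of llms.ChatMessage.
type chatMessage struct {
	role    string
	content string
}

// messageContent is a simplified analogue of llms.MessageContent.
type messageContent struct {
	Role  string
	Parts []string
}

// toMessageContent mirrors chatMessageToLLm from the diff: the message's
// content becomes a single text part, and its type becomes the role.
func toMessageContent(in chatMessage) messageContent {
	return messageContent{
		Role:  in.role,
		Parts: []string{in.content},
	}
}

func main() {
	msgs := []chatMessage{
		{role: "system", content: "You are terse."},
		{role: "human", content: "Hi"},
	}
	// Mirrors chatMessagesToLLmMessageContent: element-wise conversion.
	out := make([]messageContent, len(msgs))
	for i, m := range msgs {
		out[i] = toMessageContent(m)
	}
	fmt.Println(out[0].Role, out[1].Parts[0]) // → system Hi
}
```

Because each message keeps its own role, the multi-prompt path preserves the system/human/AI structure that `GenerateFromSinglePrompt` would otherwise flatten into one string.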
@@ -87,7 +87,6 @@ func main() {
log.Fatal(err)
}
fmt.Println(resp.Choices[0].Content)

}

// executeToolCalls executes the tool calls in the response and returns the
@@ -9,13 +9,11 @@ import (
)

func main() {

options := []llamafile.Option{
llamafile.WithEmbeddingSize(2048),
llamafile.WithTemperature(0.8),
}
llm, err := llamafile.New(options...)

if err != nil {
panic(err)
}
@@ -35,7 +33,6 @@ func main() {
fmt.Print(string(chunk))
return nil
}))

if err != nil {
panic(err)
}
35 changes: 35 additions & 0 deletions examples/llm-chain-multi-prompt-example/README.md
@@ -0,0 +1,35 @@
# LLM Chain Multi Prompt Example

Welcome to this cheerful example of using LLM (Large Language Model) chains with LangChain in Go! 🎉

This example demonstrates how to create and use LLM chains for various natural language processing tasks. Let's dive in and see what exciting things we can do!

## What Does This Example Do?

1. **Company Name Generation** 🏢
- We create an LLM chain that generates a company name based on a product.
- It uses a simple prompt template: "What is a good name for a company that makes {{.product}}?"
- We run this chain with "socks" as input and get a creative company name suggestion!

2. **Text Translation** 🌍
- We set up another LLM chain for translating text between languages.
- The prompt template asks to translate from one language to another.
- We demonstrate translating "I love programming" from English to French.

## How It Works

1. We start by setting up an OpenAI LLM.
2. For each task, we create a `PromptTemplate` with placeholders for inputs.
3. We then create `LLMChain` instances combining the LLM and the prompt templates.
4. For single-input chains, we use the `Run` function.
5. For multi-input chains, we use the `Call` function with a map of inputs.

## Running the Example

When you run this example, you'll see:
1. A suggested company name for a sock manufacturer.
2. The French translation of "I love programming".

It's a fun and practical demonstration of how LLM chains can be used for creative and linguistic tasks!

Happy coding, and enjoy exploring the world of LLM chains with Go! 🚀👨‍💻👩‍💻
39 changes: 39 additions & 0 deletions examples/llm-chain-multi-prompt-example/go.mod
@@ -0,0 +1,39 @@
module github.com/tmc/langchaingo/examples/llm-chain-multi-prompt-example

go 1.22.0

toolchain go1.22.1

require github.com/tmc/langchaingo v0.1.13-pre.0

require (
github.com/Masterminds/goutils v1.1.1 // indirect
github.com/Masterminds/semver/v3 v3.2.0 // indirect
github.com/Masterminds/sprig/v3 v3.2.3 // indirect
github.com/dlclark/regexp2 v1.10.0 // indirect
github.com/dustin/go-humanize v1.0.1 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/goph/emperror v0.17.2 // indirect
github.com/huandu/xstrings v1.3.3 // indirect
github.com/imdario/mergo v0.3.13 // indirect
github.com/json-iterator/go v1.1.12 // indirect
github.com/mitchellh/copystructure v1.0.0 // indirect
github.com/mitchellh/reflectwalk v1.0.0 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/nikolalohinski/gonja v1.5.3 // indirect
github.com/pelletier/go-toml/v2 v2.0.9 // indirect
github.com/pkg/errors v0.9.1 // indirect
github.com/pkoukk/tiktoken-go v0.1.6 // indirect
github.com/shopspring/decimal v1.2.0 // indirect
github.com/sirupsen/logrus v1.9.3 // indirect
github.com/spf13/cast v1.3.1 // indirect
github.com/yargevad/filepathx v1.0.0 // indirect
go.starlark.net v0.0.0-20230302034142-4b1e35fe2254 // indirect
golang.org/x/crypto v0.23.0 // indirect
golang.org/x/exp v0.0.0-20230713183714-613f0c0eb8a1 // indirect
golang.org/x/sys v0.20.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)

replace github.com/tmc/langchaingo => ../..