
Add support for default sticky prompts
See #716

Signed-off-by: Tomas Slusny <[email protected]>
deathbeam committed Feb 5, 2025
1 parent 40b4e36 commit c7afeea
Showing 3 changed files with 35 additions and 2 deletions.
15 changes: 14 additions & 1 deletion README.md

@@ -224,6 +224,17 @@ List all files in the workspace
 What is 1 + 11
 ```
 
+You can also set default sticky prompts in the configuration:
+
+```lua
+{
+  sticky = {
+    '@models Using Mistral-small',
+    '#files:full',
+  }
+}
+```
+
 ## Models
 
 You can list available models with `:CopilotChatModels` command. Model determines the AI model used for the chat.
@@ -417,11 +428,13 @@ Also see [here](/lua/CopilotChat/config.lua):
 -- Shared config starts here (can be passed to functions at runtime and configured via setup function)
 
 system_prompt = prompts.COPILOT_INSTRUCTIONS, -- System prompt to use (can be specified manually in prompt via /).
 
 model = 'gpt-4o', -- Default model to use, see ':CopilotChatModels' for available models (can be specified manually in prompt via $).
 agent = 'copilot', -- Default agent to use, see ':CopilotChatAgents' for available agents (can be specified manually in prompt via @).
 context = nil, -- Default context or array of contexts to use (can be specified manually in prompt via #).
-temperature = 0.1, -- GPT result temperature
+sticky = nil, -- Default sticky prompt or array of sticky prompts to use at start of every new chat.
+
+temperature = 0.1, -- GPT result temperature
 headless = false, -- Do not write to chat buffer and use history(useful for using callback for custom processing)
 callback = nil, -- Callback to use when ask response is received
 
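As a usage sketch (assuming the plugin's usual `require('CopilotChat').setup` entry point; the model and sticky values are illustrative, not part of the commit), the new option sits alongside the existing shared config:

```lua
-- A minimal sketch: setting default sticky prompts via setup().
-- Option names mirror the shared config shown above; values are illustrative.
require('CopilotChat').setup({
  model = 'gpt-4o', -- default model (overridable per prompt via $)
  sticky = {
    '@models Using Mistral-small', -- reinserted at the start of every new chat
    '#files:full',
  },
})
```
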
5 changes: 4 additions & 1 deletion lua/CopilotChat/config.lua

@@ -56,6 +56,7 @@ local utils = require('CopilotChat.utils')
 ---@field model string?
 ---@field agent string?
 ---@field context string|table<string>|nil
+---@field sticky string|table<string>|nil
 ---@field temperature number?
 ---@field headless boolean?
 ---@field callback fun(response: string, source: CopilotChat.source)?
@@ -90,11 +91,13 @@ return {
   -- Shared config starts here (can be passed to functions at runtime and configured via setup function)
 
   system_prompt = prompts.COPILOT_INSTRUCTIONS, -- System prompt to use (can be specified manually in prompt via /).
 
   model = 'gpt-4o', -- Default model to use, see ':CopilotChatModels' for available models (can be specified manually in prompt via $).
   agent = 'copilot', -- Default agent to use, see ':CopilotChatAgents' for available agents (can be specified manually in prompt via @).
   context = nil, -- Default context or array of contexts to use (can be specified manually in prompt via #).
-  temperature = 0.1, -- GPT result temperature
+  sticky = nil, -- Default sticky prompt or array of sticky prompts to use at start of every new chat.
+
+  temperature = 0.1, -- GPT result temperature
   headless = false, -- Do not write to chat buffer and use history(useful for using callback for custom processing)
   callback = nil, -- Callback to use when ask response is received
 
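Because the new `---@field sticky string|table<string>|nil` annotation is a union type, both shapes below should be accepted (a sketch; the values are illustrative):

```lua
-- Either a single sticky prompt as a plain string...
sticky = '#files:full',

-- ...or several as a list of strings; each entry gets its own '> ' line.
sticky = {
  '@models Using Mistral-small',
  '#files:full',
},
```
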
17 changes: 17 additions & 0 deletions lua/CopilotChat/init.lua

@@ -315,6 +315,23 @@ local function finish(start_of_chat)
 
   state.chat:append(M.config.question_header .. M.config.separator .. '\n\n')
 
+  -- Add default sticky prompts after reset
+  if start_of_chat then
+    if M.config.sticky then
+      local last_prompt = state.last_prompt or ''
+
+      if type(M.config.sticky) == 'table' then
+        for _, sticky in ipairs(M.config.sticky) do
+          last_prompt = last_prompt .. '\n> ' .. sticky
+        end
+      else
+        last_prompt = last_prompt .. '\n> ' .. M.config.sticky
+      end
+
+      state.last_prompt = last_prompt
+    end
+  end
+
   -- Reinsert sticky prompts from last prompt
   if state.last_prompt then
     local has_sticky = false

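To see what the added block produces, here is a behavior sketch in plain Lua (standalone, outside the plugin, using the README example values): each sticky entry is appended to `last_prompt` prefixed with `'\n> '`, presumably the quote marker that the existing "Reinsert sticky prompts from last prompt" pass scans for.

```lua
-- Behavior sketch: how the new block builds the sticky tail of last_prompt.
local sticky = { '@models Using Mistral-small', '#files:full' }

local last_prompt = ''
for _, s in ipairs(sticky) do
  last_prompt = last_prompt .. '\n> ' .. s
end

print(last_prompt)
-- Output (after a leading blank line):
-- > @models Using Mistral-small
-- > #files:full
```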