feat: change model on the fly (#415)
* added a feature where the model parameter in openai_params can now be a function, allowing the model to be changed dynamically. the readme has been updated accordingly. I have also added an example configuration with gpt-4-1106-preview to the readme, after the configuration section.

* on opening the chat window, the model was not being collapsed for the settings panel. now doing this and testing whether it works

* testing using debug output

* removed debug output again because it now seems to work magically?

* debugging

* debugging

* still debugging

* if the model is determined by a function, just display <dynamic> in settings menu

* debugging

* still debugging

* had value, key instead of key, value in a for loop because I don't know Lua, lmao; now testing

* seems to be working, testing it now

* debug output for model

* typo in toMessages function in settings.lua, fixed now

* more debugging

* still debugging :(

* vim.inspect missing

* the plugin is tested and working: you can now switch models dynamically, and if this is enabled, the chat settings will say <dynamic>. the completion features etc. have not been tested with these changes, but they should be unaffected, as I did not touch openai_completion_params etc., only openai_params. if you want to see the currently active model, add a shortcut to your config (because your config manages the model); see the sketch after this commit list

* reformatted the config sample to be more readable. you can now pass a function as model to change the model used on the fly

* finally, removed all debug notifications

* Update api.lua

* removed a goto statement by refactoring the code, because goto does not work with some stylua versions and that's really annoying and unnecessary
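
A minimal sketch of the "show the current model" shortcut mentioned above, assuming your config keeps the active model somewhere it can read back (`vim.g.chatgpt_model` and `resolve_model` are illustrative names, not plugin API):

```lua
-- Sketch only (not part of this commit): print the model your dynamic
-- `model` function would currently return.
-- `vim.g.chatgpt_model` is an assumed variable name, not plugin API.
local function resolve_model()
  return vim.g.chatgpt_model or "gpt-3.5-turbo"
end

vim.keymap.set("n", "<leader>cm", function()
  vim.notify("Active ChatGPT model: " .. resolve_model(), vim.log.levels.INFO)
end, { desc = "Show active ChatGPT model" })
```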

---------

Co-authored-by: Paper <[email protected]>
barnii77 and PaperTarsier692 authored Jun 22, 2024
1 parent df53728 commit a8b5520
Showing 5 changed files with 134 additions and 55 deletions.
43 changes: 41 additions & 2 deletions README.md
@@ -140,11 +140,50 @@ or if you are using [lazy.nvim](https://github.com/folke/lazy.nvim):

## Configuration

`ChatGPT.nvim` comes with the following defaults, you can override them by
passing config as setup param
`ChatGPT.nvim` comes with the following defaults, you can override them by passing config as setup param

https://github.com/jackMort/ChatGPT.nvim/blob/f1453f588eb47e49e57fa34ac1776b795d71e2f1/lua/chatgpt/config.lua#L10-L182

### Example Configuration

A simple configuration of the chat model could look something like this:
```lua
{
"jackMort/ChatGPT.nvim",
event = "VeryLazy",
config = function()
require("chatgpt").setup({
-- this config assumes you have OPENAI_API_KEY environment variable set
openai_params = {
-- NOTE: model can be a function returning the model name
-- this is useful if you want to change the model on the fly
-- using commands
-- Example:
-- model = function()
-- if some_condition() then
-- return "gpt-4-1106-preview"
-- else
-- return "gpt-3.5-turbo"
-- end
-- end,
model = "gpt-4-1106-preview",
frequency_penalty = 0,
presence_penalty = 0,
max_tokens = 4095,
temperature = 0.2,
top_p = 0.1,
n = 1,
}
})
end,
dependencies = {
"MunifTanjim/nui.nvim",
"nvim-lua/plenary.nvim",
"nvim-telescope/telescope.nvim"
}
}
```
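
To change the model on the fly, one possible pattern (only a sketch; `vim.g.chatgpt_model` and the `:ChatGPTModel` command are assumed names, not part of the plugin) is to have the `model` function read a value that a user command updates:

```lua
-- Hypothetical setup: `model` reads a global that a user command sets, so
-- switching models takes effect on the next request without restarting Neovim.
vim.g.chatgpt_model = "gpt-3.5-turbo"

require("chatgpt").setup({
  openai_params = {
    model = function()
      return vim.g.chatgpt_model
    end,
    max_tokens = 4095,
    temperature = 0.2,
  },
})

-- e.g. :ChatGPTModel gpt-4-1106-preview
vim.api.nvim_create_user_command("ChatGPTModel", function(opts)
  vim.g.chatgpt_model = opts.args
end, { nargs = 1 })
```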

### Secrets Management

Providing the OpenAI API key via an environment variable is dangerous, as it
15 changes: 12 additions & 3 deletions lua/chatgpt/api.lua
@@ -1,16 +1,24 @@
local job = require("plenary.job")
local Config = require("chatgpt.config")
local logger = require("chatgpt.common.logger")
local Utils = require("chatgpt.utils")

local Api = {}

function Api.completions(custom_params, cb)
local params = vim.tbl_extend("keep", custom_params, Config.options.openai_params)
local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
local params = vim.tbl_extend("keep", custom_params, openai_params)
Api.make_call(Api.COMPLETIONS_URL, params, cb)
end

function Api.chat_completions(custom_params, cb, should_stop)
local params = vim.tbl_extend("keep", custom_params, Config.options.openai_params)
local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
local params = vim.tbl_extend("keep", custom_params, openai_params)
-- the custom params contains <dynamic> if model is not constant but function
-- therefore, use collapsed openai params (with function evaluated to get model) if that is the case
if params.model == "<dynamic>" then
params.model = openai_params.model
end
local stream = params.stream or false
if stream then
local raw_chunks = ""
@@ -90,7 +98,8 @@ function Api.chat_completions(custom_params, cb, should_stop)
end

function Api.edits(custom_params, cb)
local params = vim.tbl_extend("keep", custom_params, Config.options.openai_edit_params)
local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
local params = vim.tbl_extend("keep", custom_params, openai_params)
if params.model == "text-davinci-edit-001" or params.model == "code-davinci-edit-001" then
vim.notify("Edit models are deprecated", vim.log.levels.WARN)
Api.make_call(Api.EDITS_URL, params, cb)
93 changes: 45 additions & 48 deletions lua/chatgpt/code_edits.lua
@@ -350,57 +350,54 @@ M.edit_with_instructions = function(output_lines, bufnr, selection, ...)
-- cycle windows
for _, popup in ipairs({ input_window, output_window, settings_panel, help_panel, instructions_input }) do
for _, mode in ipairs({ "n", "i" }) do
if mode == "i" and (popup == input_window or popup == output_window) then
goto continue
end

popup:map(mode, Config.options.edit_with_instructions.keymaps.cycle_windows, function()
-- #352 is a bug where active_panel is something not in here, maybe an
-- old window or something, lost amongst the global state
local possible_windows = {
input_window,
output_window,
settings_panel,
help_panel,
instructions_input,
unpack(open_extra_panels),
}

-- So if active_panel isn't something we expect it to be, make it do be.
if not inTable(possible_windows, active_panel) then
active_panel = instructions_input
end

local active_panel_is_in_extra_panels = inTable(open_extra_panels, active_panel)
if active_panel == instructions_input then
vim.api.nvim_set_current_win(input_window.winid)
active_panel = input_window
vim.api.nvim_command("stopinsert")
elseif active_panel == input_window and mode ~= "i" then
vim.api.nvim_set_current_win(output_window.winid)
active_panel = output_window
vim.api.nvim_command("stopinsert")
elseif active_panel == output_window and mode ~= "i" then
if #open_extra_panels == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
if not (mode == "i" and (popup == input_window or popup == output_window)) then
popup:map(mode, Config.options.edit_with_instructions.keymaps.cycle_windows, function()
-- #352 is a bug where active_panel is something not in here, maybe an
-- old window or something, lost amongst the global state
local possible_windows = {
input_window,
output_window,
settings_panel,
help_panel,
instructions_input,
unpack(open_extra_panels),
}

-- So if active_panel isn't something we expect it to be, make it do be.
if not inTable(possible_windows, active_panel) then
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[1].winid)
active_panel = open_extra_panels[1]
end
elseif active_panel_is_in_extra_panels then
-- next index with wrap around and 0 for instructions_input
local next_index = (active_panel_is_in_extra_panels + 1) % (#open_extra_panels + 1)
if next_index == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[next_index].winid)
active_panel = open_extra_panels[next_index]

local active_panel_is_in_extra_panels = inTable(open_extra_panels, active_panel)
if active_panel == instructions_input then
vim.api.nvim_set_current_win(input_window.winid)
active_panel = input_window
vim.api.nvim_command("stopinsert")
elseif active_panel == input_window and mode ~= "i" then
vim.api.nvim_set_current_win(output_window.winid)
active_panel = output_window
vim.api.nvim_command("stopinsert")
elseif active_panel == output_window and mode ~= "i" then
if #open_extra_panels == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[1].winid)
active_panel = open_extra_panels[1]
end
elseif active_panel_is_in_extra_panels then
-- next index with wrap around and 0 for instructions_input
local next_index = (active_panel_is_in_extra_panels + 1) % (#open_extra_panels + 1)
if next_index == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[next_index].winid)
active_panel = open_extra_panels[next_index]
end
end
end
end, {})
::continue::
end, {})
end
end
end

14 changes: 12 additions & 2 deletions lua/chatgpt/flows/chat/base.lua
@@ -517,7 +517,7 @@ function Chat:toMessages()
role = "assistant"
end
local content = {}
if self.params.model == "gpt-4-vision-preview" then
if Utils.collapsed_openai_params(self.params).model == "gpt-4-vision-preview" then
for _, line in ipairs(msg.lines) do
table.insert(content, createContent(line))
end
@@ -736,7 +736,17 @@ function Chat:get_layout_params()
end

function Chat:open()
self.settings_panel = Settings.get_settings_panel("chat_completions", self.params)
local displayed_params = Utils.table_shallow_copy(self.params)
-- if the param is decided by a function and not constant, write <dynamic> for now
-- TODO: if the current model should be displayed, the settings_panel would
-- have to be constantly modified or rewritten to be able to manage a function
-- returning the model as well
for key, value in pairs(self.params) do
if type(value) == "function" then
displayed_params[key] = "<dynamic>"
end
end
self.settings_panel = Settings.get_settings_panel("chat_completions", displayed_params)
self.help_panel = Help.get_help_panel("chat")
self.sessions_panel = Sessions.get_panel(function(session)
self:set_session(session)
24 changes: 24 additions & 0 deletions lua/chatgpt/utils.lua
@@ -2,6 +2,30 @@ local M = {}

local ESC_FEEDKEY = vim.api.nvim_replace_termcodes("<ESC>", true, false, true)

---@param tbl table
---@return table
function M.table_shallow_copy(tbl)
local copy = {}
for key, value in pairs(tbl) do
copy[key] = value
end
return copy
end

--- A function that collapses the openai params.
--- This means all the parameters of the openai_params that can be either constants or functions
--- will be set to constants by evaluating the functions.
---@param openai_params table
---@return table
function M.collapsed_openai_params(openai_params)
local collapsed = M.table_shallow_copy(openai_params)
-- use copied version of table so the original model value remains a function and can still change
if type(collapsed.model) == "function" then
collapsed.model = collapsed.model()
end
return collapsed
end

function M.split(text)
local t = {}
for str in string.gmatch(text, "%S+") do
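
Outside the diff, a small illustration of how the new `collapsed_openai_params` helper behaves (the parameter values are made up):

```lua
-- Illustration only: a function-valued `model` is evaluated into a constant
-- on a shallow copy, so the original table keeps its function.
local Utils = require("chatgpt.utils")

local params = {
  model = function() return "gpt-4-1106-preview" end,
  temperature = 0.2,
}

local collapsed = Utils.collapsed_openai_params(params)
-- collapsed.model == "gpt-4-1106-preview"; params.model is still a function
```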
