
Refactor the plugin to be lua-based #83

Merged
merged 7 commits into from
Feb 28, 2024

Conversation

@deathbeam deathbeam commented Feb 27, 2024

  • Rewrite copilot class to lua
  • Implement vsplit chat in lua
  • Move prompts to lua
  • Move commands to lua

Changes

  • Plugin now depends on https://github.com/nvim-lua/plenary.nvim (mainly for its curl module)
  • Removed the vlog code and replaced it with plenary.log, as it was copied from there anyway
  • Copilot jobs are now properly terminated on window close, window loss, and new chat prompts
  • .ask and .open can accept a config table as a second parameter to override parts of the config per command (for example chat.ask('Hello', { window = { layout = 'float' } }))
    • Prompts can now be defined as tables (name = prompt still works too), and when defined as a table you can specify config just for that prompt (similar to chat.ask/open). mapping is also supported, which automatically creates a keybinding for that prompt
  • Merged some of the InPlace chat features into the VSplit chat features, so the vsplit chat now:
    • Supports different layouts (horizontal split, vertical split, and floating window)
    • Lets you type into the chat and submit a prompt
    • Lets you make code replacements from the chat
    • Lets you trigger completion for user and system prompts by pressing Tab after /. You can also use the / syntax to invoke shortcuts to your prompts directly or to use a different system prompt
    • Has fully configurable chat keybindings
    • Shows help as virtual text in the split chat
    • Shows the spinner as virtual text at the end of the text written by Copilot
    • Uses different separators due to the support for submitting prompts (it is just ---, i.e. a markdown horizontal rule, and it is configurable)
  • Did not reimplement the InPlace chat in Lua, as the vsplit chat with some of the in-place interactive features is a better design in my opinion, so I did not want to spend energy on it
  • Did not reimplement requesting authentication in Lua; it only fetches the token now. In my opinion it is better to let plugins like copilot.vim, developed by tpope and Microsoft, handle the initial auth. But if this is not desired, I can move that Python part as well, as it is not hard
  • Removed CopilotChatVisual and CopilotChatBuffer; these can be handled with selectors instead
  • By default, CopilotChat now behaves like CopilotChatVisual, with a configurable selector for the unnamed buffer. The unnamed buffer looked like a workaround for some selection issues anyway, but with the new selector system I can just pass whatever data I want to the commands
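
As a quick sketch of the two points above (per-command config overrides, and selectors replacing CopilotChatVisual), assuming the modules shown in the config example later in this PR:

local chat = require('CopilotChat')
local select = require('CopilotChat.select')

-- Override parts of the config for a single call:
-- this one opens the chat in a floating window.
chat.ask('Hello', { window = { layout = 'float' } })

-- Instead of the removed CopilotChatVisual command, pass the
-- visual selection explicitly through a selector.
chat.ask('Explain this code', { selection = select.visual })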

TODO

  • For proxy support, this is needed: Add support for curl insecure and curl proxy nvim-lua/plenary.nvim#559
  • Delete the rest of the Python code? Or finish the in-place rewrite first, then delete it
  • Check for curl availability with a health check
  • Add the folds logic from Python, maybe? Not sure if this is even needed
  • As mentioned in the Changes section, finish rewriting the authentication request if needed
  • Properly get the token file path; at the moment it only supports Linux (easy fix)
  • Update the README and related docs
  • Add token count support to extra_info, something like this called from Lua:
@pynvim.function("CopilotChatTiktokenLen", sync=True)
def CopilotChatTiktokenLen(self, args):
    import tiktoken
    input = args[0]
    model = args[1]
    # Count tokens using the encoding for the given model
    return len(tiktoken.encoding_for_model(model).encode(input))
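
The token file path TODO above could be addressed with something like the sketch below. The paths are assumptions modeled on where copilot.vim keeps its auth data; verify against the actual plugin before relying on them:

-- Sketch of a cross-platform lookup for the Copilot token file.
-- The github-copilot/hosts.json location is an assumption here.
local function find_config_path()
  if vim.fn.has('win32') > 0 then
    return vim.fn.expand('$LOCALAPPDATA')
  elseif vim.fn.isdirectory(vim.fn.expand('$XDG_CONFIG_HOME')) > 0 then
    return vim.fn.expand('$XDG_CONFIG_HOME')
  else
    return vim.fn.expand('~/.config')
  end
end

local token_file = find_config_path() .. '/github-copilot/hosts.json'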

New config example

local chat = require('CopilotChat')
local select = require('CopilotChat.select')
local prompts = require('CopilotChat.prompts')

-- This is just default config + some custom prompts showing mappings
chat.setup {
  system_prompt = prompts.COPILOT_INSTRUCTIONS,
  model = 'gpt-4',
  temperature = 0.1,
  debug = false,
  clear_chat_on_new_prompt = false,
  disable_extra_info = true,
  name = 'CopilotChat',
  separator = '---',
  prompts = {
    FixDiagnostic = {
        prompt = 'Please assist with the following diagnostic issue in file:',
        selection = select.diagnostics,
        mapping = '<leader>ar',
    },
    Explain = {
        prompt = prompts.USER_EXPLAIN, -- You can also just use a prompt value straight from the prompts module
        mapping = '<leader>ae',
    },
    Tests = {
        prompt = '/USER_TESTS', -- You can use references like this, works even outside of completion
        mapping = '<leader>at',
    },
    Documentation = {
        prompt = 'Add documentation comments to the selected code.',
        mapping = '<leader>ad',
    },
    Fix = {
        system_prompt = prompts.COPILOT_FIX, -- Or override system prompt like this
        prompt = 'Propose a fix for the problems in the selected code.',
        mapping = '<leader>af',
    },
    Optimize = {
        prompt = '/COPILOT_DEVELOPER Optimize the selected code to improve performance and readability.', -- Or use a system prompt reference in the prompt
        mapping = '<leader>ao',
    },
    Simplify = {
        prompt = 'Simplify the selected code and improve readability.',
        mapping = '<leader>as',
    },
  },
  selection = function()
    return select.visual() or select.line()
  end,
  window = {
    layout = 'vertical',
    width = 0.8,
    height = 0.6,
    border = 'single',
    title = 'Copilot Chat',
  },
  mappings = {
    close = 'q',
    reset = '<C-l>',
    complete = '<Tab>',
    submit_prompt = '<CR>',
    submit_code = '<C-y>',
  },
}

vim.keymap.set({ 'n', 'v' }, '<leader>aa', chat.toggle, { desc = 'AI Toggle' })
vim.keymap.set({ 'n', 'v' }, '<leader>ax', chat.reset, { desc = 'AI Reset' })
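
For context, a prompt's mapping field (like <leader>ae on Explain above) should be roughly equivalent to defining the keymap by hand. This is a sketch of the intent, not the plugin's actual internals:

-- Roughly what mapping = '<leader>ae' on the Explain prompt does
-- (illustrative; the plugin wires this up automatically):
vim.keymap.set({ 'n', 'v' }, '<leader>ae', function()
  local chat = require('CopilotChat')
  local prompts = require('CopilotChat.prompts')
  chat.ask(prompts.USER_EXPLAIN)
end, { desc = 'AI Explain' })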

Media

Floating window

(screenshot)

Completion

(screenshot)

deathbeam added a commit to deathbeam/dotfiles that referenced this pull request Feb 27, 2024
@deathbeam (Collaborator, Author)

Also added support for prompt.mapping + example

@deathbeam (Collaborator, Author)

Added some more examples to prompts with overriding and references

gptlang commented Feb 27, 2024

Thank you for your work!

Add token count support to extra_info, something like this called from lua

Perhaps we could strip Python entirely and use tiktoken as a shared object (it's written in rust: https://github.com/openai/tiktoken/blob/main/src/lib.rs)
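
One way the shared-object idea could look from the Neovim side, via the LuaJIT FFI. The exported function name and signature here are invented purely for illustration; the real tiktoken.lua bindings may differ:

local ffi = require('ffi')

-- Hypothetical C ABI for a compiled tiktoken library;
-- not an actual exported symbol.
ffi.cdef([[
  int tiktoken_count(const char* model, const char* text);
]])

local lib = ffi.load('tiktoken')
local n = lib.tiktoken_count('gpt-4', 'Hello world')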

gptlang commented Feb 27, 2024

I'll do some experiments and see if it works

gptlang commented Feb 27, 2024

Notes:

gptlang commented Feb 27, 2024

Looks like I need to refactor the library to use functional programming to pass around state.

Reference: mlua-rs/mlua#130

@jellydn jellydn changed the base branch from main to canary February 27, 2024 23:34
gptlang commented Feb 28, 2024

WIP here: https://github.com/gptlang/tiktoken.lua

@jellydn jellydn left a comment

Great, amazing work @deathbeam! Let me merge to canary and test. Thanks.

@jellydn jellydn merged commit 842b71a into CopilotC-Nvim:canary Feb 28, 2024
1 check passed
gptlang pushed a commit that referenced this pull request Feb 28, 2024
* Refactor the plugin to be lua-based

Signed-off-by: Tomas Slusny <[email protected]>

* Add open/close/toggle commands and function

* Add support for prompt.mapping to map prompts to keys

* Fix issue with system_prompt replace not using correct value and fix naming of USER_ prompts

* Move some of chat buffer logic to separate file, allow changing window layout properly

* Disable python part of the plugin for now

* Fix check if message is copilot message

Signed-off-by: Tomas Slusny <[email protected]>

---------

Signed-off-by: Tomas Slusny <[email protected]>
@deathbeam deathbeam deleted the lua-refactor branch March 7, 2024 22:36
3 participants