
Copilot Chat for Neovim


Note

The plugin was rewritten from Python to Lua. Please check the migration guide from version 1 to version 2 for more information.

Prerequisites

Ensure you have the following installed:

  • Neovim stable (0.9.5) or nightly.

Optional:

  • tiktoken_core: sudo luarocks install --lua-version 5.1 tiktoken_core. Alternatively, download a pre-built binary from lua-tiktoken releases
  • You can check Neovim's Lua C module search path with :lua print(package.cpath). Save the binary as tiktoken_core.so in any of the listed paths.

For Arch Linux users, you can install luajit-tiktoken-bin or lua51-tiktoken-bin from the AUR!

Installation

Lazy.nvim

return {
  {
    "CopilotC-Nvim/CopilotChat.nvim",
    branch = "canary",
    dependencies = {
      { "zbirenbaum/copilot.lua" }, -- or github/copilot.vim
      { "nvim-lua/plenary.nvim" }, -- for curl, log wrapper
    },
    build = "make tiktoken", -- Only on MacOS or Linux
    opts = {
      -- See Configuration section for options
    },
    -- See Commands section for default commands if you want to lazy load on them
  },
}

See @jellydn's configuration for an example setup

Vim-Plug

Similar to the lazy setup, you can use the following configuration:

call plug#begin()
Plug 'zbirenbaum/copilot.lua'
Plug 'nvim-lua/plenary.nvim'
Plug 'CopilotC-Nvim/CopilotChat.nvim', { 'branch': 'canary' }
call plug#end()

lua << EOF
require("CopilotChat").setup {
  -- See Configuration section for options
}
EOF

Manual

  1. Put the files in the right place
mkdir -p ~/.config/nvim/pack/copilotchat/start
cd ~/.config/nvim/pack/copilotchat/start

git clone https://github.com/zbirenbaum/copilot.lua
git clone https://github.com/nvim-lua/plenary.nvim

git clone -b canary https://github.com/CopilotC-Nvim/CopilotChat.nvim
  2. Add to your configuration (e.g. ~/.config/nvim/init.lua)
require("CopilotChat").setup {
  -- See Configuration section for options
}

See @deathbeam's configuration for an example setup

Post-Installation

Verify "Copilot chat in the IDE" is enabled.

Usage

Commands

  • :CopilotChat <input>? - Open chat window with optional input
  • :CopilotChatOpen - Open chat window
  • :CopilotChatClose - Close chat window
  • :CopilotChatToggle - Toggle chat window
  • :CopilotChatStop - Stop current copilot output
  • :CopilotChatReset - Reset chat window
  • :CopilotChatSave <name>? - Save chat history to file
  • :CopilotChatLoad <name>? - Load chat history from file
  • :CopilotChatDebugInfo - Show debug information
  • :CopilotChatModels - View and select available models. The selection is reset when a new chat instance is created; set your model in init.lua for persistence (see the example after this list).
  • :CopilotChatAgents - View and select available agents. The selection is reset when a new chat instance is created; set your agent in init.lua for persistence.
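
A minimal sketch of persisting these choices in init.lua (the values shown are just examples; use whatever :CopilotChatModels and :CopilotChatAgents list for your account):

require("CopilotChat").setup({
  model = "gpt-4o",  -- example; any model listed by :CopilotChatModels
  agent = "copilot", -- example; any agent listed by :CopilotChatAgents
})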

Prompts

You can ask Copilot to perform various tasks with prompts. You can reference prompts with /PromptName in chat or call them with the command :CopilotChat<PromptName>.
Default prompts are:

  • Explain - Write an explanation for the selected code and diagnostics as paragraphs of text
  • Review - Review the selected code
  • Fix - There is a problem in this code. Rewrite the code to show it with the bug fixed
  • Optimize - Optimize the selected code to improve performance and readability
  • Docs - Please add documentation comments to the selected code
  • Tests - Please generate tests for my code
  • Commit - Write commit message for the change with commitizen convention
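
For example, with a visual selection active you can run the Explain prompt from the command line:

:CopilotChatExplain

or reference it directly in the chat window:

/Explain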

System Prompts

System prompts specify the behavior of the AI model. You can reference system prompts with /PROMPT_NAME in chat. Default system prompts are:

  • COPILOT_INSTRUCTIONS - Base GitHub Copilot instructions
  • COPILOT_EXPLAIN - On top of the base instructions adds coding tutor behavior
  • COPILOT_REVIEW - On top of the base instructions adds code review behavior with instructions on how to generate diagnostics
  • COPILOT_GENERATE - On top of the base instructions adds code generation behavior, with predefined formatting and generation rules
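
For example, you can combine a system prompt with your own question directly in the chat window:

/COPILOT_EXPLAIN

How does the selected function handle errors?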

Sticky Prompts

You can set a sticky prompt in chat by prefixing text with > using markdown blockquote syntax.
The sticky prompt is copied to the start of every new prompt in the chat window. You can edit sticky prompts freely; the only rule is the > prefix at the beginning of a line.
This is useful for preserving things like context and agent selection (see below).
Example usage:

> #files

List all files in the workspace
> @models Using Mistral-small

What is 1 + 11

Models

You can list available models with the :CopilotChatModels command. The model determines the AI model used for the chat.
You can set the model in the prompt by using $ followed by the model name.
Default models are:

  • gpt-4o - This is the default Copilot Chat model. It is a versatile, multimodal model that excels in both text and image processing and is designed to provide fast, reliable responses. It also has superior performance in non-English languages. GPT-4o is hosted on Azure.
  • claude-3.5-sonnet - This model excels at coding tasks across the entire software development lifecycle, from initial design to bug fixes, maintenance to optimizations. GitHub Copilot uses Claude 3.5 Sonnet hosted on Amazon Web Services.
  • o1-preview - This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the gpt-4o model. You can make 10 requests to this model per day. o1-preview is hosted on Azure.
  • o1-mini - This is the faster version of the o1-preview model, balancing the use of complex reasoning with the need for faster responses. It is best suited for code generation and small context operations. You can make 50 requests to this model per day. o1-mini is hosted on Azure.

For more information about models, see here.
You can use more models from here by using the @models agent (example: @models Using Mistral-small, what is 1 + 11).
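
For example, to pick a model for a single prompt you can prefix it with the $ selector (the model name is just one of the defaults listed above):

$claude-3.5-sonnet Explain the selected code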

Agents

Agents determine which AI agent is used for the chat. You can list available agents with the :CopilotChatAgents command.
You can set the agent in the prompt by using @ followed by the agent name.
The default "noop" agent is copilot.

For more information about extension agents, see here
You can install more agents from here

Contexts

Contexts attach additional information (such as buffers, files, or git diffs) to the chat.
You can set the context in the prompt by using # followed by the context name.
If a context supports input, you can set the input in the prompt by using : followed by the input (or by pressing the complete key after :); see the example after the list below.
Default contexts are:

  • buffer - Includes specified buffer in chat context (default current). Supports input.
  • buffers - Includes all buffers in chat context (default listed). Supports input.
  • file - Includes content of provided file in chat context. Supports input.
  • files - Includes all non-hidden filenames in the current workspace in chat context. Supports input.
  • git - Includes current git diff in chat context (default unstaged). Supports input.
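
For example, sticky lines can keep a specific file and the staged git diff in context across prompts (the file path is illustrative):

> #file:lua/config/init.lua
> #git:staged

Explain how these changes affect the rest of the file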

API

local chat = require("CopilotChat")

-- Open chat window
chat.open()

-- Open chat window with custom options
chat.open({
  window = {
    layout = 'float',
    title = 'My Title',
  },
})

-- Close chat window
chat.close()

-- Toggle chat window
chat.toggle()

-- Toggle chat window with custom options
chat.toggle({
  window = {
    layout = 'float',
    title = 'My Title',
  },
})

-- Reset chat window
chat.reset()

-- Ask a question
chat.ask("Explain how it works.")

-- Ask a question with custom options
chat.ask("Explain how it works.", {
  selection = require("CopilotChat.select").buffer,
})

-- Ask a question and do something with the response
chat.ask("Show me something interesting", {
  callback = function(response)
    print("Response:", response)
  end,
})

-- Get all available prompts (can be used for integrations like fzf/telescope)
local prompts = chat.prompts()

-- Get last copilot response (also can be used for integrations and custom keymaps)
local response = chat.response()

-- Pick a prompt using vim.ui.select
local actions = require("CopilotChat.actions")

-- Pick prompt actions
actions.pick(actions.prompt_actions({
    selection = require("CopilotChat.select").visual,
}))

-- Programmatically set log level
chat.log_level("debug")

Configuration

Default configuration

Also see here:

{
  debug = false, -- Enable debug logging (same as setting log_level = 'debug')
  log_level = 'info', -- Log level to use, 'trace', 'debug', 'info', 'warn', 'error', 'fatal'
  proxy = nil, -- [protocol://]host[:port] Use this proxy
  allow_insecure = false, -- Allow insecure server connections

  system_prompt = prompts.COPILOT_INSTRUCTIONS, -- System prompt to use (can be specified manually in prompt via /).
  model = 'gpt-4o', -- Default model to use, see ':CopilotChatModels' for available models (can be specified manually in prompt via $).
  agent = 'copilot', -- Default agent to use, see ':CopilotChatAgents' for available agents (can be specified manually in prompt via @).
  context = nil, -- Default context to use (can be specified manually in prompt via #).
  temperature = 0.1, -- GPT result temperature

  question_header = '## User ', -- Header to use for user questions
  answer_header = '## Copilot ', -- Header to use for AI answers
  error_header = '## Error ', -- Header to use for errors
  separator = '───', -- Separator to use in chat

  chat_autocomplete = true, -- Enable chat autocompletion (when disabled, requires manual `mappings.complete` trigger)
  show_folds = true, -- Shows folds for sections in chat
  show_help = true, -- Shows help message as virtual lines when waiting for user input
  auto_follow_cursor = true, -- Auto-follow cursor in chat
  auto_insert_mode = false, -- Automatically enter insert mode when opening window and on new prompt
  insert_at_end = false, -- Move cursor to end of buffer when inserting text
  clear_chat_on_new_prompt = false, -- Clears chat on every new prompt
  highlight_selection = true, -- Highlight selection in the source buffer when in the chat window
  highlight_headers = true, -- Highlight headers in chat, disable if using markdown renderers (like render-markdown.nvim)

  history_path = vim.fn.stdpath('data') .. '/copilotchat_history', -- Default path to stored history
  callback = nil, -- Callback to use when ask response is received

  -- default selection
  selection = function(source)
    return select.visual(source) or select.buffer(source)
  end,

  -- default contexts
  contexts = {
    buffer = {
      -- see config.lua for implementation
    },
    buffers = {
      -- see config.lua for implementation
    },
    file = {
      -- see config.lua for implementation
    },
    files = {
      -- see config.lua for implementation
    },
    git = {
      -- see config.lua for implementation
    },
  },

  -- default prompts
  prompts = {
    Explain = {
      prompt = '> /COPILOT_EXPLAIN\n\nWrite an explanation for the selected code and diagnostics as paragraphs of text.',
    },
    Review = {
      prompt = '> /COPILOT_REVIEW\n\nReview the selected code.',
      -- see config.lua for implementation
    },
    Fix = {
      prompt = '> /COPILOT_GENERATE\n\nThere is a problem in this code. Rewrite the code to show it with the bug fixed.',
    },
    Optimize = {
      prompt = '> /COPILOT_GENERATE\n\nOptimize the selected code to improve performance and readability.',
    },
    Docs = {
      prompt = '> /COPILOT_GENERATE\n\nPlease add documentation comments to the selected code.',
    },
    Tests = {
      prompt = '> /COPILOT_GENERATE\n\nPlease generate tests for my code.',
    },
    Commit = {
      prompt = '> #git:staged\n\nWrite commit message for the change with commitizen convention. Make sure the title has maximum 50 characters and message is wrapped at 72 characters. Wrap the whole message in code block with language gitcommit.',
    },
  },

  -- default window options
  window = {
    layout = 'vertical', -- 'vertical', 'horizontal', 'float', 'replace'
    width = 0.5, -- fractional width of parent, or absolute width in columns when > 1
    height = 0.5, -- fractional height of parent, or absolute height in rows when > 1
    -- Options below only apply to floating windows
    relative = 'editor', -- 'editor', 'win', 'cursor', 'mouse'
    border = 'single', -- 'none', 'single', 'double', 'rounded', 'solid', 'shadow'
    row = nil, -- row position of the window, default is centered
    col = nil, -- column position of the window, default is centered
    title = 'Copilot Chat', -- title of chat window
    footer = nil, -- footer of chat window
    zindex = 1, -- determines if window is on top or below other floating windows
  },

  -- default mappings
  mappings = {
    complete = {
      insert = '<Tab>',
    },
    close = {
      normal = 'q',
      insert = '<C-c>'
    },
    reset = {
      normal = '<C-l>',
      insert = '<C-l>'
    },
    submit_prompt = {
      normal = '<CR>',
      insert = '<C-s>'
    },
    toggle_sticky = {
      detail = 'Makes line under cursor sticky or deletes sticky line.',
      normal = 'gr',
    },
    accept_diff = {
      normal = '<C-y>',
      insert = '<C-y>'
    },
    yank_diff = {
      normal = 'gy',
      register = '"',
    },
    show_diff = {
      normal = 'gd'
    },
    show_system_prompt = {
      normal = 'gp'
    },
    show_user_selection = {
      normal = 'gs'
    },
  },
}

For further reference, you can view @jellydn's configuration.

Defining a prompt with command and keymap

This will define a prompt that you can reference with /MyCustomPrompt in chat, call with :CopilotChatMyCustomPrompt, or trigger with the keymap <leader>ccmc. It uses the visual selection as the default selection. If you are using lazy.nvim and are already lazy loading based on commands, make sure to include the prompt commands and keymaps in cmd and keys respectively (see the sketch after the example below).

{
  prompts = {
    MyCustomPrompt = {
      prompt = 'Explain how it works.',
      mapping = '<leader>ccmc',
      description = 'My custom prompt description',
      selection = require('CopilotChat.select').visual,
    },
  },
}
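
A minimal lazy.nvim sketch of that lazy-loading setup (the cmd and keys entries match the custom prompt above; the rest mirrors the installation snippet):

{
  "CopilotC-Nvim/CopilotChat.nvim",
  branch = "canary",
  cmd = { "CopilotChat", "CopilotChatMyCustomPrompt" },
  keys = {
    { "<leader>ccmc", "<cmd>CopilotChatMyCustomPrompt<CR>", desc = "CopilotChat - My custom prompt" },
  },
  opts = {
    prompts = {
      MyCustomPrompt = {
        prompt = "Explain how it works.",
        mapping = "<leader>ccmc",
        description = "My custom prompt description",
      },
    },
  },
}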

Referencing system or user prompts

You can reference system or user prompts in your configuration or in chat with the /PROMPT_NAME slash notation. For the collection of default COPILOT_ (system) and USER_ (user) prompts, see here.

{
  prompts = {
    MyCustomPrompt = {
      prompt = '/COPILOT_EXPLAIN Explain how it works.',
    },
    MyCustomPrompt2 = {
      prompt = '/MyCustomPrompt Include some additional context.',
    },
  },
}

Custom system prompts

You can define custom system prompts by using the system_prompt property, either at the top level of the config or per prompt.

{
  system_prompt = 'Your name is GitHub Copilot and you are an AI assistant for developers.',
  prompts = {
    Johnny = {
      system_prompt = 'Your name is Johny Microsoft and you are not an AI assistant for developers.',
      prompt = 'Explain how it works.',
    },
    Yarrr = {
      system_prompt = 'You are fascinated by pirates, so please respond in pirate speak.'
    },
  },
}

To use any of your custom prompts, simply run :CopilotChat<PromptName>, e.g. :CopilotChatJohnny or :CopilotChatYarrr What is a sorting algo?. Tab autocomplete will help you out.

Customizing buffers

You can set local options for the buffers that are created by this plugin: copilot-diff, copilot-system-prompt, copilot-user-selection, copilot-chat.

vim.api.nvim_create_autocmd('BufEnter', {
    pattern = 'copilot-*',
    callback = function()
        vim.opt_local.relativenumber = true

        -- C-p to print last response
        vim.keymap.set('n', '<C-p>', function()
          print(require("CopilotChat").response())
        end, { buffer = true, remap = true })
    end
})

Tips

Quick chat with your buffer

To chat with Copilot using the entire content of the buffer, you can add the following configuration to your keymap:

-- lazy.nvim keys

  -- Quick chat with Copilot
  {
    "<leader>ccq",
    function()
      local input = vim.fn.input("Quick Chat: ")
      if input ~= "" then
        require("CopilotChat").ask(input, { selection = require("CopilotChat.select").buffer })
      end
    end,
    desc = "CopilotChat - Quick chat",
  }

Chat with buffer

Inline chat

Change the window layout to float and position it relative to the cursor to make the window look like inline chat. This lets you chat with Copilot without opening a separate split.

-- lazy.nvim opts

  {
    window = {
      layout = 'float',
      relative = 'cursor',
      width = 1,
      height = 0.4,
      row = 1
    }
  }


Telescope integration

Requires telescope.nvim plugin to be installed.

-- lazy.nvim keys

  -- Show prompts actions with telescope
  {
    "<leader>ccp",
    function()
      local actions = require("CopilotChat.actions")
      require("CopilotChat.integrations.telescope").pick(actions.prompt_actions())
    end,
    desc = "CopilotChat - Prompt actions",
  },


fzf-lua integration

Requires fzf-lua plugin to be installed.

-- lazy.nvim keys

  -- Show prompts actions with fzf-lua
  {
    "<leader>ccp",
    function()
      local actions = require("CopilotChat.actions")
      require("CopilotChat.integrations.fzflua").pick(actions.prompt_actions())
    end,
    desc = "CopilotChat - Prompt actions",
  },


render-markdown integration

Requires render-markdown plugin to be installed.

-- Registers copilot-chat filetype for markdown rendering
require('render-markdown').setup({
  file_types = { 'markdown', 'copilot-chat' },
})

-- You might also want to disable default header highlighting for copilot chat when doing this
require('CopilotChat').setup({
  highlight_headers = false,
  -- rest of your config
})


Roadmap (Wishlist)

  • Use an indexed vector database over the current workspace for better context selection
  • General QOL improvements

Development

Installing Pre-commit Tool

For development, you can use the provided Makefile command to install the pre-commit tool:

make install-pre-commit

This will install the pre-commit tool and the pre-commit hooks.

Contributors ✨

If you want to contribute to this project, please read the CONTRIBUTING.md file.

Thanks goes to these wonderful people (emoji key: 💻 code, 📖 documentation):

  • gptlang - 💻 📖
  • Dung Duc Huynh (Kaka) - 💻 📖
  • Ahmed Haracic - 💻
  • Trí Thiện Nguyễn - 💻
  • He Zhizhou - 💻
  • Guruprakash Rajakkannu - 💻
  • kristofka - 💻
  • PostCyberPunk - 📖
  • Katsuhiko Nishimra - 💻
  • Erno Hopearuoho - 💻
  • Shaun Garwood - 💻
  • neutrinoA4 - 💻 📖
  • Jack Muratore - 💻
  • Adriel Velazquez - 💻 📖
  • Tomas Slusny - 💻 📖
  • Nisal - 📖
  • Tobias Gårdhus - 📖
  • Petr Dlouhý - 📖
  • Dylan Madisetti - 💻
  • Aaron Weisberg - 💻 📖
  • Jose Tlacuilo - 💻 📖
  • Kevin Traver - 💻 📖
  • dTry - 💻
  • Arata Furukawa - 💻
  • Ling - 💻
  • Ivan Frolov - 💻
  • Folke Lemaitre - 💻 📖
  • GitMurf - 💻
  • Dmitrii Lipin - 💻
  • jinzhongjia - 📖
  • guill - 💻
  • Sjon-Paul Brown - 💻
  • Renzo Mondragón - 💻 📖
  • fjchen7 - 💻
  • Radosław Woźniak - 💻
  • JakubPecenka - 💻
  • thomastthai - 📖
  • Tomáš Janoušek - 💻
  • Toddneal Stallworth - 📖
  • Sergey Alexandrov - 💻

This project follows the all-contributors specification. Contributions of any kind are welcome!

Stargazers over time

