All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/), this project adheres to [Semantic Versioning](https://semver.org/), and the file is generated by [Changie](https://changie.dev/).

## Pre-release Changes
- Support for Gemini 1.5 Pro
- Support for improved retrieval models (Voyage embeddings/reranking)
- New @code context provider
- Personal usage analytics
- Tab-autocomplete in beta
- Image support
- Full-text search index for retrieval
- Docs context provider
- CodeLlama-70b support
- config.ts now runs only in Node.js, not in the browser
- Fixed proxy setting in config.json
- Added codellama and gemini to the free trial, using the new server
- Local codebase syncing and embeddings using LanceDB
- Improved VS Code theme matching
- Updated packaging to download native modules for the current platform (lancedb, sqlite, onnxruntime, tree-sitter WASMs)
- Context providers now run on the extension side (in Node.js instead of browser JavaScript)
- New disableSessionTitles option in config.json
- Use Ollama's /chat endpoint instead of raw completions by default, and the /show endpoint to gather model parameters such as context length and stop tokens
- Support for .continuerc.json in the workspace root to override config.json
- Inline context providers
- cmd+shift+L with new diff streaming UI for edits
- Allow certain LLM servers to handle templating
- Context items are now kept as part of past messages instead of remaining in the main input
- No more Python server: Continue runs entirely in TypeScript
- Migrated to the .json config file format
- Full screen mode
- StackOverflow slash command to augment with web search
- VS Code context menus: right click to add code to context, debug the terminal, or share your Continue session
- Reliability improvements to the JetBrains extension by bringing it up to date with the Socket.IO refactor
- Codebase Retrieval: Use /codebase or cmd+enter and Continue will automatically gather the most important context
- Switched from WebSockets to Socket.IO
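
As a minimal sketch of the override behavior noted above, a .continuerc.json placed in the workspace root could flip the disableSessionTitles option from config.json; the exact schema beyond that key is an assumption here:

```json
{
  "disableSessionTitles": true
}
```

With this file present, the workspace-level settings would be applied on top of the global config.json.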