A VS Code extension that integrates Ollama's DeepSeek model for local AI assistance, providing a chat interface inside VS Code for interacting with a DeepSeek language model running entirely on your machine.
- 🚀 Local AI processing - all operations run on your machine
- 💻 Native VS Code integration
- 🔒 Privacy-focused - no data sent to external servers
- ⚡ Real-time streaming responses
- 📋 One-click copy functionality
- 🎨 Adaptive theme support - matches your VS Code theme
- ⌨️ Keyboard shortcuts for efficient interaction
Before using this extension, please ensure you have:
- Ollama installed on your system
- The DeepSeek model pulled locally (the tag should match the model configured in the extension settings, `deepseek-r1:7b` by default):

  ```shell
  ollama pull deepseek-r1:7b
  ```
- Install the extension from VS Code Marketplace
- Ensure Ollama is running on your system
- Open Command Palette (Ctrl+Shift+P / Cmd+Shift+P)
- Type "Deep Seek Chat" to start using the extension
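Under the hood, the extension talks to Ollama's local HTTP API (port 11434 by default). A minimal sketch of how an extension could confirm the required model is installed via Ollama's `GET /api/tags` endpoint — the helper names here are illustrative, not the extension's actual code:

```typescript
// Shape of one entry in Ollama's GET /api/tags response.
interface OllamaModel {
  name: string; // e.g. "deepseek-r1:7b"
}

interface TagsResponse {
  models: OllamaModel[];
}

// True if the requested model (exact tag, or bare name without ":tag") is present.
function hasModel(tags: TagsResponse, wanted: string): boolean {
  return tags.models.some(
    (m) => m.name === wanted || m.name.split(":")[0] === wanted
  );
}

// Usage against a running Ollama instance (requires a fetch-capable runtime):
async function checkOllama(wanted: string): Promise<boolean> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) return false; // Ollama not running or unreachable
  return hasModel((await res.json()) as TagsResponse, wanted);
}
```

If the check fails, starting `ollama serve` (or the Ollama desktop app) and re-running the command is usually enough.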
- Open the chat interface using Command Palette or the keyboard shortcut
- Type your question in the input field
- Press Enter or click "Send Message" to get a response
- Use the copy button to copy the response to clipboard
- Use Shift+Enter for new lines in the input field
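Streaming responses arrive from Ollama as newline-delimited JSON, one chunk per line (this chunk shape follows Ollama's `/api/chat` streaming format; the helper name is illustrative). A hedged sketch of how those chunks could be folded into the text shown in the chat panel:

```typescript
// One NDJSON line from Ollama's streaming /api/chat endpoint.
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Fold a batch of raw NDJSON lines into the accumulated response text.
function accumulateChunks(ndjson: string): string {
  let text = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const chunk = JSON.parse(line) as ChatChunk;
    if (chunk.message) text += chunk.message.content;
    if (chunk.done) break; // final chunk carries stats, no content
  }
  return text;
}
```

Appending each chunk to the webview as it arrives is what produces the real-time typing effect.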
This extension contributes the following settings:
- `ollamaDeepseek.model`: The DeepSeek model variant to use (default: `"deepseek-r1:7b"`)
- `ollamaDeepseek.maxTokens`: Maximum number of tokens in a response (default: `2048`)
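For example, in your `settings.json` (the values shown are the defaults described above):

```json
{
  "ollamaDeepseek.model": "deepseek-r1:7b",
  "ollamaDeepseek.maxTokens": 2048
}
```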
- The extension requires Ollama to be running locally
- Initial model loading might take a few seconds
- Responses are currently limited to English
- Initial release
- Basic chat functionality
- Copy to clipboard feature
- Theme-aware styling
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
Enjoy coding with your local AI assistant!