feat(deepseek): Adds Deepseek integration #7604

Merged
merged 7 commits on Jan 28, 2025
312 changes: 312 additions & 0 deletions docs/core_docs/docs/integrations/chat/deepseek.ipynb
@@ -0,0 +1,312 @@
{
"cells": [
{
"cell_type": "raw",
"id": "afaf8039",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: DeepSeek\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "e49f1e0d",
"metadata": {},
"source": [
"# ChatDeepSeek\n",
"\n",
"This will help you getting started with DeepSeek [chat models](/docs/concepts/#chat-models). For detailed documentation of all `ChatDeepSeek` features and configurations head to the [API reference](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
"\n",
"| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/docs/integrations/chat/deepseek) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [`ChatDeepSeek`](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html) | [`@langchain/deepseek`](https://npmjs.com/@langchain/deepseek) | ❌ (see [Ollama](/docs/integrations/chat/ollama)) | beta | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/deepseek?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/deepseek?style=flat-square&label=%20&) |\n",
"\n",
"### Model features\n",
"\n",
"See the links in the table headers below for guides on how to use specific features.\n",
"\n",
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | \n",
"\n",
"Note that as of 1/27/25, tool calling and structured output are not currently supported for `deepseek-reasoner`.\n",
"\n",
"## Setup\n",
"\n",
"To access DeepSeek models you'll need to create a DeepSeek account, get an API key, and install the `@langchain/deepseek` integration package.\n",
"\n",
"You can also access the DeepSeek API through providers like [Together AI](/docs/integrations/chat/togetherai) or [Ollama](/docs/integrations/chat/ollama).\n",
"\n",
"### Credentials\n",
"\n",
"Head to https://deepseek.com/ to sign up to DeepSeek and generate an API key. Once you've done this set the `DEEPSEEK_API_KEY` environment variable:\n",
"\n",
"```bash\n",
"export DEEPSEEK_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGSMITH_TRACING=\"true\"\n",
"# export LANGSMITH_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
"\n",
"The LangChain ChatDeepSeek integration lives in the `@langchain/deepseek` package:\n",
"\n",
"```{=mdx}\n",
"import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
"import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
"\n",
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/deepseek @langchain/core\n",
"</Npm2Yarn>\n",
"\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "a38cde65-254d-4219-a441-068766c0d4b5",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
"metadata": {},
"outputs": [],
"source": [
"import { ChatDeepSeek } from \"@langchain/deepseek\";\n",
"\n",
"const llm = new ChatDeepSeek({\n",
" model: \"deepseek-reasoner\",\n",
" temperature: 0,\n",
" // other params...\n",
"})"
]
},
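{
"cell_type": "markdown",
"id": "apikey-constructor-note",
"metadata": {},
"source": [
"If you prefer not to rely on the `DEEPSEEK_API_KEY` environment variable, the key can also be passed directly to the constructor. This is a minimal, untested sketch; `apiKey` is assumed to be the constructor option that carries the key, as in other LangChain chat model integrations:\n",
"\n",
"```typescript\n",
"// Hypothetical sketch: supply the key explicitly instead of reading DEEPSEEK_API_KEY.\n",
"const llmWithExplicitKey = new ChatDeepSeek({\n",
"  apiKey: \"your-api-key\",\n",
"  model: \"deepseek-reasoner\",\n",
"});\n",
"```"
]
},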
{
"cell_type": "markdown",
"id": "2b4f3e15",
"metadata": {},
"source": [
"<!-- ## Invocation -->"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "62e0dbc3",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"AIMessage {\n",
" \"id\": \"e2874482-68a7-4552-8154-b6a245bab429\",\n",
" \"content\": \"J'adore la programmation.\",\n",
" \"additional_kwargs\": {,\n",
" \"reasoning_content\": \"...\",\n",
" },\n",
" \"response_metadata\": {\n",
" \"tokenUsage\": {\n",
" \"promptTokens\": 23,\n",
" \"completionTokens\": 7,\n",
" \"totalTokens\": 30\n",
" },\n",
" \"finish_reason\": \"stop\",\n",
" \"model_name\": \"deepseek-reasoner\",\n",
" \"usage\": {\n",
" \"prompt_tokens\": 23,\n",
" \"completion_tokens\": 7,\n",
" \"total_tokens\": 30,\n",
" \"prompt_tokens_details\": {\n",
" \"cached_tokens\": 0\n",
" },\n",
" \"prompt_cache_hit_tokens\": 0,\n",
" \"prompt_cache_miss_tokens\": 23\n",
" },\n",
" \"system_fingerprint\": \"fp_3a5770e1b4\"\n",
" },\n",
" \"tool_calls\": [],\n",
" \"invalid_tool_calls\": [],\n",
" \"usage_metadata\": {\n",
" \"output_tokens\": 7,\n",
" \"input_tokens\": 23,\n",
" \"total_tokens\": 30,\n",
" \"input_token_details\": {\n",
" \"cache_read\": 0\n",
" },\n",
" \"output_token_details\": {}\n",
" }\n",
"}\n"
]
}
],
"source": [
"const aiMsg = await llm.invoke([\n",
" [\n",
" \"system\",\n",
" \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
" ],\n",
" [\"human\", \"I love programming.\"],\n",
"])\n",
"aiMsg"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"J'adore la programmation.\n"
]
}
],
"source": [
"console.log(aiMsg.content)"
]
},
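{
"cell_type": "markdown",
"id": "reasoning-content-note",
"metadata": {},
"source": [
"### Accessing `reasoning_content`\n",
"\n",
"As the output above shows, `deepseek-reasoner` returns its reasoning trace under `additional_kwargs.reasoning_content`. A minimal sketch for reading it alongside the final answer (the field name is taken from the response above):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "reasoning-content-example",
"metadata": {},
"outputs": [],
"source": [
"// Log the reasoning trace that accompanied the translation above.\n",
"console.log(aiMsg.additional_kwargs.reasoning_content);"
]
},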
{
"cell_type": "markdown",
"id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"AIMessage {\n",
" \"id\": \"6e7f6f8c-8d7a-4dad-be07-425384038fd4\",\n",
" \"content\": \"Ich liebe es zu programmieren.\",\n",
" \"additional_kwargs\": {,\n",
" \"reasoning_content\": \"...\",\n",
" },\n",
" \"response_metadata\": {\n",
" \"tokenUsage\": {\n",
" \"promptTokens\": 18,\n",
" \"completionTokens\": 9,\n",
" \"totalTokens\": 27\n",
" },\n",
" \"finish_reason\": \"stop\",\n",
" \"model_name\": \"deepseek-reasoner\",\n",
" \"usage\": {\n",
" \"prompt_tokens\": 18,\n",
" \"completion_tokens\": 9,\n",
" \"total_tokens\": 27,\n",
" \"prompt_tokens_details\": {\n",
" \"cached_tokens\": 0\n",
" },\n",
" \"prompt_cache_hit_tokens\": 0,\n",
" \"prompt_cache_miss_tokens\": 18\n",
" },\n",
" \"system_fingerprint\": \"fp_3a5770e1b4\"\n",
" },\n",
" \"tool_calls\": [],\n",
" \"invalid_tool_calls\": [],\n",
" \"usage_metadata\": {\n",
" \"output_tokens\": 9,\n",
" \"input_tokens\": 18,\n",
" \"total_tokens\": 27,\n",
" \"input_token_details\": {\n",
" \"cache_read\": 0\n",
" },\n",
" \"output_token_details\": {}\n",
" }\n",
"}\n"
]
}
],
"source": [
"import { ChatPromptTemplate } from \"@langchain/core/prompts\"\n",
"\n",
"const prompt = ChatPromptTemplate.fromMessages(\n",
" [\n",
" [\n",
" \"system\",\n",
" \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
" ],\n",
" [\"human\", \"{input}\"],\n",
" ]\n",
")\n",
"\n",
"const chain = prompt.pipe(llm);\n",
"await chain.invoke(\n",
" {\n",
" input_language: \"English\",\n",
" output_language: \"German\",\n",
" input: \"I love programming.\",\n",
" }\n",
")"
]
},
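{
"cell_type": "markdown",
"id": "structured-output-note",
"metadata": {},
"source": [
"## Structured output with `deepseek-chat`\n",
"\n",
"As noted in the feature table above, tool calling and structured output are available for `deepseek-chat` (though not for `deepseek-reasoner`). The sketch below is illustrative rather than something run here: it assumes the standard `withStructuredOutput` method from the base chat model interface and that the `zod` package is installed; the schema and variable names are made up for the example."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "structured-output-example",
"metadata": {},
"outputs": [],
"source": [
"import { z } from \"zod\";\n",
"\n",
"// Use deepseek-chat here, since deepseek-reasoner does not support structured output.\n",
"const chatModel = new ChatDeepSeek({\n",
"  model: \"deepseek-chat\",\n",
"  temperature: 0,\n",
"});\n",
"\n",
"// Schema describing the shape we want back from the model.\n",
"const translationSchema = z.object({\n",
"  translation: z.string().describe(\"The translated sentence\"),\n",
"  targetLanguage: z.string().describe(\"The language of the translation\"),\n",
"});\n",
"\n",
"const structuredLlm = chatModel.withStructuredOutput(translationSchema);\n",
"await structuredLlm.invoke(\"Translate 'I love programming.' into German.\");"
]
},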
{
"cell_type": "markdown",
"id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all ChatDeepSeek features and configurations head to the API reference: https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "TypeScript",
"language": "typescript",
"name": "tslab"
},
"language_info": {
"codemirror_mode": {
"mode": "typescript",
"name": "javascript",
"typescript": true
},
"file_extension": ".ts",
"mimetype": "text/typescript",
"name": "typescript",
"version": "3.7.2"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
2 changes: 1 addition & 1 deletion docs/core_docs/docs/integrations/chat/xai.ipynb
@@ -23,7 +23,7 @@
"\n",
"[xAI](https://x.ai/) is an artificial intelligence company that develops large language models (LLMs). Their flagship model, Grok, is trained on real-time X (formerly Twitter) data and aims to provide witty, personality-rich responses while maintaining high capability on technical tasks.\n",
"\n",
"This guide will help you getting started with `ChatXAI` [chat models](/docs/concepts/chat_models). For detailed documentation of all `ChatXAI` features and configurations head to the [API reference](https://api.js.langchain.com/classes/langchain_community_chat_models_fireworks.ChatXAI.html).\n",
"This guide will help you getting started with `ChatXAI` [chat models](/docs/concepts/chat_models). For detailed documentation of all `ChatXAI` features and configurations head to the [API reference](https://api.js.langchain.com/classes/_langchain_xai.ChatXAI.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
1 change: 1 addition & 0 deletions examples/package.json
@@ -45,6 +45,7 @@
"@langchain/cohere": "workspace:*",
"@langchain/community": "workspace:*",
"@langchain/core": "workspace:*",
"@langchain/deepseek": "workspace:*",
"@langchain/exa": "workspace:*",
"@langchain/google-common": "workspace:*",
"@langchain/google-genai": "workspace:*",
1 change: 1 addition & 0 deletions libs/langchain-deepseek/.env.example
@@ -0,0 +1 @@
DEEPSEEK_API_KEY="your_key"