
Commit

chore(langchain): update imports and examples
Signed-off-by: Tomas Dvorak <[email protected]>
Tomas2D committed Nov 10, 2023
1 parent 49153ea commit 0dc832f
Showing 3 changed files with 18 additions and 17 deletions.
7 changes: 4 additions & 3 deletions README.md
````diff
@@ -243,6 +243,7 @@ await model.call('Tell me a joke.', undefined, [
 
 ```typescript
 import { GenAIChatModel } from '@ibm-generative-ai/node-sdk/langchain';
+import { SystemMessage, HumanMessage } from 'langchain/schema';
 
 const client = new GenAIChatModel({
   modelId: 'eleutherai/gpt-neox-20b',
@@ -268,13 +269,13 @@ const client = new GenAIChatModel({
 });
 
 const response = await client.call([
-  new SystemChatMessage(
+  new SystemMessage(
     'You are a helpful assistant that translates English to Spanish.',
   ),
-  new HumanChatMessage('I love programming.'),
+  new HumanMessage('I love programming.'),
 ]);
 
-console.info(response.text); // "Me encanta la programación."
+console.info(response.content); // "Me encanta la programación."
 ```
 
 #### Prompt Templates (GenAI x LangChain)
````
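The README change tracks LangChain's rename of `SystemChatMessage`/`HumanChatMessage` to `SystemMessage`/`HumanMessage`, and of the `.text` accessor to `.content`. A minimal self-contained sketch of what the rename means for calling code (the classes below are illustrative stand-ins, not the real `langchain/schema` exports):

```typescript
// Stand-in message classes mirroring the new API shape: a shared base
// class exposing `content` (the old *ChatMessage classes exposed `text`).
class BaseMessage {
  constructor(public content: string) {}
}
class SystemMessage extends BaseMessage {}
class HumanMessage extends BaseMessage {}

const messages: BaseMessage[] = [
  new SystemMessage('You are a helpful assistant.'),
  new HumanMessage('I love programming.'),
];

// Consumers now read `.content` instead of `.text`:
const prompt = messages.map((m) => m.content).join('\n');
console.log(prompt);
```

Any code that still reads `.text` on a message or response object needs the same one-token change shown in the diff.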
2 changes: 1 addition & 1 deletion src/langchain/llm-chat.ts
```diff
@@ -74,7 +74,7 @@ export class GenAIChatModel extends BaseChatModel {
           `Unsupported message type "${msg._getType()}"`,
         );
       }
-      return `${type.stopSequence}${msg.text}`;
+      return `${type.stopSequence}${msg.content}`;
     })
     .join('\n')
     .concat(this.#rolesMapping.system.stopSequence);
```
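The one-line change in `llm-chat.ts` sits inside the prompt-serialization step: each message is prefixed with its role's stop sequence, the pieces are joined with newlines, and the system stop sequence is appended so the model continues as the assistant. A standalone sketch of that pipeline (the role tokens and mapping shape are assumptions for illustration, not the SDK's actual values):

```typescript
// Hypothetical role mapping; real stop sequences come from the SDK config.
interface Role {
  stopSequence: string;
}
const rolesMapping: Record<string, Role> = {
  system: { stopSequence: '<|system|>' },
  human: { stopSequence: '<|user|>' },
};

const messages = [
  { type: 'system', content: 'You are helpful.' },
  { type: 'human', content: 'Hi!' },
];

// Mirrors the changed line: prefix each message's `content` (formerly
// `text`) with its role token, join, then append the system token.
const prompt = messages
  .map((msg) => `${rolesMapping[msg.type].stopSequence}${msg.content}`)
  .join('\n')
  .concat(rolesMapping.system.stopSequence);

console.log(prompt);
```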
26 changes: 13 additions & 13 deletions src/tests/e2e/langchain/llm-chat.test.ts
```diff
@@ -1,4 +1,4 @@
-import { HumanChatMessage, SystemChatMessage } from 'langchain/schema';
+import { HumanMessage, SystemMessage } from 'langchain/schema';
 
 import { GenAIChatModel } from '../../../langchain/index.js';
 import { describeIf } from '../../utils.js';
@@ -47,34 +47,34 @@ describeIf(process.env.RUN_LANGCHAIN_CHAT_TESTS === 'true')(
       const chat = makeClient();
 
       const response = await chat.call([
-        new HumanChatMessage(
+        new HumanMessage(
           'What is a good name for a company that makes colorful socks?',
         ),
       ]);
-      expectIsNonEmptyString(response.text);
+      expectIsNonEmptyString(response.content);
     });
 
     test('should handle question with additional hint', async () => {
       const chat = makeClient();
 
       const response = await chat.call([
-        new SystemChatMessage(SYSTEM_MESSAGE),
-        new HumanChatMessage('I love programming.'),
+        new SystemMessage(SYSTEM_MESSAGE),
+        new HumanMessage('I love programming.'),
       ]);
-      expectIsNonEmptyString(response.text);
+      expectIsNonEmptyString(response.content);
     });
 
     test('should handle multiple questions', async () => {
       const chat = makeClient();
 
       const response = await chat.generate([
        [
-          new SystemChatMessage(SYSTEM_MESSAGE),
-          new HumanChatMessage('I love programming.'),
+          new SystemMessage(SYSTEM_MESSAGE),
+          new HumanMessage('I love programming.'),
        ],
        [
-          new SystemChatMessage(SYSTEM_MESSAGE),
-          new HumanChatMessage('I love artificial intelligence.'),
+          new SystemMessage(SYSTEM_MESSAGE),
+          new HumanMessage('I love artificial intelligence.'),
        ],
      ]);
 
@@ -95,7 +95,7 @@ describeIf(process.env.RUN_LANGCHAIN_CHAT_TESTS === 'true')(
       });
 
       const output = await chat.call(
-        [new HumanChatMessage('Tell me a joke.')],
+        [new HumanMessage('Tell me a joke.')],
         undefined,
         [
           {
@@ -105,8 +105,8 @@ describeIf(process.env.RUN_LANGCHAIN_CHAT_TESTS === 'true')(
       );
 
       expect(handleNewToken).toHaveBeenCalled();
-      expectIsNonEmptyString(output.text);
-      expect(tokens.join('')).toStrictEqual(output.text);
+      expectIsNonEmptyString(output.content);
+      expect(tokens.join('')).toStrictEqual(output.content);
     });
   });
 },
```
