Commit

add tests and docs
tomfrenken committed Jan 9, 2025
1 parent 27a3350 commit 5351267
Showing 7 changed files with 19 additions and 66 deletions.
9 changes: 5 additions & 4 deletions packages/orchestration/README.md
@@ -175,9 +175,10 @@ Abort controller can be useful, e.g., when end-user wants to stop the stream or
#### Stream Options

The orchestration service offers multiple streaming options, which you can configure in addition to the LLM's streaming options.
-There are two ways to add specific streaming options to your client, either at initalization, or dynamically when calling the stream API.
+These include options like defining the maximum number of characters per chunk or modifying the output filter behavior.
+There are two ways to add specific streaming options to your client, either at initialization of the orchestration client, or when calling the stream API.

-Dynamically setting these options after client initialization is particularly helpful when you've initialized a client with a config meant for regular chat completion and now want to switch to using streaming.
+Setting streaming options dynamically could be useful if an initialized orchestration client will also be used for streaming.

You can check the list of available stream options in the [orchestration service's documentation](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/streaming).

@@ -197,8 +198,8 @@ const response = orchestrationClient.stream(
);
```

-Usage metrics are collected by default, if you do not want to receive them, set include_usage to false.
-If you don't want any streaming options as part of your call to the LLM, set options.llm = null.
+Usage metrics are collected by default; if you do not want to receive them, set `include_usage` to `false`.
+If you don't want any streaming options as part of your call to the LLM, set `streamOptions.llm` to `null`.

> [!NOTE]
> When initializing a client with a JSON module config, providing streaming options is not possible.
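To make this concrete, here is a minimal sketch of passing stream options dynamically at call time. The client config, model name, prompt, the `global.chunk_size` key, and the `getDeltaContent()` accessor are illustrative assumptions; `include_usage` and setting `llm` to `null` come from the text above, and the initialization-time variant is omitted because its exact shape is not shown in this diff.

```ts
import { OrchestrationClient } from '@sap-ai-sdk/orchestration';

// Client set up for plain chat completion; config shape and model name
// are illustrative assumptions, not taken from this commit.
const orchestrationClient = new OrchestrationClient({
  llm: { model_name: 'gpt-4o', model_params: {} },
  templating: {
    template: [
      { role: 'user', content: 'Give a short history of {{?country}}.' }
    ]
  }
});

const controller = new AbortController();

// Stream options passed dynamically when calling the stream API:
const response = await orchestrationClient.stream(
  { inputParams: { country: 'France' } },
  controller,
  {
    global: { chunk_size: 100 }, // assumed key: maximum characters per chunk
    llm: { include_usage: false } // usage metrics are on by default; opt out here
    // llm: null would disable LLM stream options entirely
  }
);

for await (const chunk of response.stream) {
  console.log(chunk.getDeltaContent()); // assumed accessor on the stream chunk response
}
```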
2 changes: 2 additions & 0 deletions packages/orchestration/src/orchestration-client.test.ts
@@ -435,4 +435,6 @@ describe('orchestration service client', () => {
      'Could not parse JSON'
    );
  });
+
+  // add test for executing streaming with options with a JSON client, check for warning log
});
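The TODO above could take roughly the following shape, as a sketch under stated assumptions: the JSON config string, the logger message context, the constructor overload, and the warning behavior are assumptions, the suite's existing imports are reused, and the HTTP mocking used elsewhere in this suite is omitted.

```ts
import { createLogger } from '@sap-cloud-sdk/util';

it('should log a warning when stream options are used with a JSON module config', async () => {
  // Assumed message context; spying works because createLogger caches per context.
  const logger = createLogger({
    package: 'orchestration',
    messageContext: 'orchestration-client'
  });
  const warnSpy = jest.spyOn(logger, 'warn');

  // Assumed constructor overload: a raw JSON module config string.
  const clientWithJsonConfig = new OrchestrationClient(`{
    "module_configurations": {
      "llm_module_config": {
        "model_name": "gpt-4o",
        "model_params": {}
      }
    }
  }`);

  await clientWithJsonConfig.stream(
    { inputParams: {} },
    new AbortController(),
    { llm: { include_usage: false } } // stream options that should trigger the warning
  );

  expect(warnSpy).toHaveBeenCalled();
});
```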
@@ -3,8 +3,7 @@ import { OrchestrationStreamChunkResponse } from './orchestration-stream-chunk-r

describe('Orchestration chat completion stream chunk response', () => {
  let mockResponses: {
-    tokenUsageResponse: any;
-    finishReasonResponse: any;
+    tokenUsageAndFinishReasonResponse: any;
    deltaContentResponse: any;
  };
  let orchestrationStreamChunkResponses: {
@@ -15,13 +14,9 @@ describe('Orchestration chat completion stream chunk response', () => {

  beforeAll(async () => {
    mockResponses = {
-      tokenUsageResponse: await parseMockResponse<any>(
+      tokenUsageAndFinishReasonResponse: await parseMockResponse<any>(
        'orchestration',
-        'orchestration-chat-completion-stream-chunk-response-token-usage.json'
-      ),
-      finishReasonResponse: await parseMockResponse<any>(
-        'orchestration',
-        'orchestration-chat-completion-stream-chunk-response-finish-reason.json'
+        'orchestration-chat-completion-stream-chunk-response-token-usage-and-finish-reason.json'
      ),
      deltaContentResponse: await parseMockResponse<any>(
        'orchestration',
@@ -30,10 +25,10 @@
    };
    orchestrationStreamChunkResponses = {
      tokenUsageResponse: new OrchestrationStreamChunkResponse(
-        mockResponses.tokenUsageResponse
+        mockResponses.tokenUsageAndFinishReasonResponse
      ),
      finishReasonResponse: new OrchestrationStreamChunkResponse(
-        mockResponses.finishReasonResponse
+        mockResponses.tokenUsageAndFinishReasonResponse
      ),
      deltaContentResponse: new OrchestrationStreamChunkResponse(
        mockResponses.deltaContentResponse
@@ -44,10 +39,10 @@
  it('should return the chat completion stream chunk response', () => {
    expect(
      orchestrationStreamChunkResponses.tokenUsageResponse.data
-    ).toStrictEqual(mockResponses.tokenUsageResponse);
+    ).toStrictEqual(mockResponses.tokenUsageAndFinishReasonResponse);
    expect(
      orchestrationStreamChunkResponses.finishReasonResponse.data
-    ).toStrictEqual(mockResponses.finishReasonResponse);
+    ).toStrictEqual(mockResponses.tokenUsageAndFinishReasonResponse);
    expect(
      orchestrationStreamChunkResponses.deltaContentResponse.data
    ).toStrictEqual(mockResponses.deltaContentResponse);
4 changes: 4 additions & 0 deletions packages/orchestration/src/orchestration-utils.temp.ts
@@ -0,0 +1,4 @@
+// create complete config
+// test with config + addStreamOptionsToLlmModuleConfig
+// test with config + addStreamOptionsToOutputFilteringConfig
+// test complete flow with addStreamOptions
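A hedged sketch of the first planned case, assuming a signature and merge behavior for the helper named above; the actual export in orchestration-utils may differ.

```ts
import { addStreamOptionsToLlmModuleConfig } from './orchestration-utils'; // assumed export

describe('stream option utils', () => {
  const llmModuleConfig = { model_name: 'gpt-4o', model_params: {} };

  it('adds LLM stream options to the LLM module config', () => {
    // Assumed signature and merge behavior: stream options land in model_params.
    const result = addStreamOptionsToLlmModuleConfig(llmModuleConfig, {
      llm: { include_usage: true }
    });
    expect(result.model_params).toMatchObject({
      stream_options: { include_usage: true }
    });
  });
});
```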
2 changes: 1 addition & 1 deletion packages/orchestration/tsconfig.json
@@ -6,7 +6,7 @@
    "tsBuildInfoFile": "./dist/.tsbuildinfo",
    "composite": true
  },
-  "include": ["src/**/*.ts"],
+  "include": ["src/**/*.ts", "src/orchestration-utils.temp.ts"],
  "exclude": ["dist/**/*", "test/**/*", "**/*.test.ts", "node_modules/**/*"],
  "references": [{ "path": "../core" }, { "path": "../ai-api" }]
}

This file was deleted.
