
Fix o1-preview model support in Azure OpenAI #904

Closed · wants to merge 18 commits

Conversation


@henryperkins henryperkins commented Dec 7, 2024


Related to #903

Update code to support Azure OpenAI `o1-preview` model with `maxCompletionTokens` property.

* Add `maxCompletionTokens` property to `ModelConfig` interface in `src/aiParams.ts`.
* Update `handleOpenAIExtraArgs` method in `src/LLMProviders/chatModelManager.ts` to set `maxCompletionTokens` property for `o1-preview` model.
* Pass `maxCompletionTokens` property to `ChatOpenAI` constructor in `getModelConfig` method in `src/LLMProviders/chatModelManager.ts`.
* Update `createChainWithNewModel` and `setChain` methods in `src/LLMProviders/chainManager.ts` to handle `o1-preview` model.
* Add specific mention of `o1-preview` model support in Azure OpenAI section in `README.md`.
* Add instructions for setting up `o1-preview` model in Azure OpenAI section in `src/settings/components/ApiSettings.tsx`.
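
The first two bullets above can be sketched in TypeScript. This is an illustrative sketch only, not the actual PR diff: the trimmed `ModelConfig` shape and the body of `handleOpenAIExtraArgs` are assumptions inferred from the bullet list, based on `o1-preview` expecting a `max_completion_tokens` budget rather than `max_tokens`.

```typescript
// Hypothetical, trimmed-down ModelConfig; the real interface in
// src/aiParams.ts has more fields.
interface ModelConfig {
  modelName: string;
  temperature?: number;
  maxTokens?: number;
  maxCompletionTokens?: number; // new: token budget used by o1-preview
}

// Move the token budget from maxTokens to maxCompletionTokens when the
// model is o1-preview, since that model rejects max_tokens.
function handleOpenAIExtraArgs(config: ModelConfig): ModelConfig {
  if (config.modelName.startsWith("o1-preview")) {
    const { maxTokens, ...rest } = config;
    return { ...rest, maxCompletionTokens: maxTokens };
  }
  return config;
}

const out = handleOpenAIExtraArgs({ modelName: "o1-preview", maxTokens: 4000 });
console.log(out.maxCompletionTokens); // 4000; maxTokens is no longer present
```

Configs for other models pass through unchanged, so the rest of the provider code does not need to know about the rename.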

---

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/logancyang/obsidian-copilot/issues/903?shareId=XXXX-XXXX-XXXX-XXXX).
* **src/aiParams.ts**
  - Add `maxCompletionTokens` property to `ModelConfig` interface

* **src/LLMProviders/chainManager.ts**
  - Update `setChain` method to accept `SetChainOptions`
  - Add `getEffectivePrompt` method to handle `o1-preview` model prompt

* **README.md**
  - Add instructions for setting up `o1-preview` model in Azure OpenAI section

* **src/settings/components/ApiSettings.tsx**
  - Add specific instructions for `o1-preview` model configuration in Azure OpenAI section
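
One plausible shape for the `getEffectivePrompt` change listed above. The function name comes from the commit summary, but its body here is an assumption: `o1-preview` does not accept a system message, so the system prompt has to be folded into the user turn.

```typescript
interface ChatMessage {
  role: "system" | "user";
  content: string;
}

// o1-preview rejects system messages, so merge the system prompt into the
// user message for that model; other models get the usual two-message form.
function getEffectivePrompt(
  modelName: string,
  systemPrompt: string,
  userMessage: string
): ChatMessage[] {
  if (modelName.startsWith("o1-preview")) {
    return [{ role: "user", content: `${systemPrompt}\n\n${userMessage}` }];
  }
  return [
    { role: "system", content: systemPrompt },
    { role: "user", content: userMessage },
  ];
}
```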
@logancyang (Owner) commented:

Looks like code duplication with #850

@logancyang (Owner) left a review comment:

The change is OK but duplicates existing parts of the file. Please reorganize and clean it up before merging.

…axTokens` and `temperature`

* **src/aiParams.ts**
  - Add `maxCompletionTokens` as an optional parameter in `ModelConfig` interface
  - Ensure `temperature` is set to 1 for `o1-preview` models in `ModelConfig` interface
  - Add `azureOpenAIApiDeploymentName` as an optional parameter in `CustomModel` interface

* **src/LLMProviders/chatModelManager.ts**
  - Add validation for `maxTokens` and `temperature` in `getModelConfig` method
  - Add `handleAzureOpenAIExtraArgs` method to handle Azure OpenAI extra arguments
  - Update Azure OpenAI configuration in `getModelConfig` method

* **src/settings/components/ApiSettings.tsx**
  - Add specific instructions for setting up `o1-preview` model in Azure OpenAI section
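
A minimal sketch of the `temperature` validation described above, assuming that Azure's `o1-preview` only accepts the default temperature of 1; the function name is hypothetical, not from the PR.

```typescript
// Clamp a setting that o1-preview does not support; other models pass through.
function normalizeO1Temperature(modelName: string, temperature: number): number {
  if (modelName.startsWith("o1-preview") && temperature !== 1) {
    console.warn(`o1-preview only supports temperature 1; overriding ${temperature}`);
    return 1;
  }
  return temperature;
}
```

Silently overriding (with a warning) rather than throwing keeps existing user settings from breaking chat entirely when they switch to an `o1-preview` model.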
@henryperkins (Author) commented:

Still running into connection error issues.

…am to exclude streaming logic for the `o1-preview` model

* **chainManager.ts**
  - Update `createChainWithNewModel` to handle `o1-preview` model settings
  - Modify `setChain` to exclude streaming logic for `o1-preview` model

* **chatModelManager.ts**
  - Update `getModelConfig` to set `streaming` property to `false` for `o1-preview` model
  - Validate `maxTokens` and `temperature` settings

* **chainRunner.ts**
  - Update `handleResponse` and `handleRetrievalResponse` to exclude streaming logic for `o1-preview` model

* **langchainStream.ts**
  - Add `streamLangChain` and `streamMultimodal` functions to handle streaming logic
  - Exclude streaming logic for `o1-preview` model
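
The streaming exclusion across these files can be illustrated as follows. `runModel` and its callback shape are hypothetical stand-ins for the `setChain`/`handleResponse` logic, assuming `o1-preview` does not support streamed responses and must fall back to a single blocking call.

```typescript
// Run a model either by streaming chunks or, for o1-preview, by a single
// one-shot invocation delivered to the UI as one "chunk".
async function runModel(
  modelName: string,
  invoke: () => Promise<string>,
  stream: () => AsyncIterable<string>,
  onChunk: (s: string) => void
): Promise<string> {
  if (modelName.startsWith("o1-preview")) {
    const full = await invoke(); // no streaming support: one-shot completion
    onChunk(full);
    return full;
  }
  let full = "";
  for await (const chunk of stream()) {
    full += chunk;
    onChunk(chunk); // incremental UI update for models that can stream
  }
  return full;
}
```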
Added `azureOpenAIApiInstanceName` and `azureOpenAIApiVersion` properties to the `CustomModel` interface to resolve TypeScript errors related to missing properties.

`src/LLMProviders/chatModelManager.ts`:

* Corrected import paths to use relative paths, ensuring the correct modules are imported.
* Handled the `openAIOrgId` property separately by adding it to the configuration object only when necessary, using a type assertion to bypass TypeScript's type checking for this specific property.
* Ensured all parameters have explicit types to avoid implicit `any` type errors.
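
The `openAIOrgId` workaround described above might look like this sketch; the interface is trimmed down and `buildConfig` is a hypothetical helper, but the field names follow the commit text.

```typescript
// Trimmed-down Azure config; the real CustomModel interface has more fields.
interface AzureConfig {
  azureOpenAIApiInstanceName: string;
  azureOpenAIApiVersion: string;
  azureOpenAIApiDeploymentName?: string;
}

function buildConfig(base: AzureConfig, openAIOrgId?: string): AzureConfig {
  const config = { ...base };
  if (openAIOrgId) {
    // openAIOrgId is not declared on AzureConfig, so a type assertion
    // sidesteps the compiler check while still sending the field at runtime.
    (config as unknown as Record<string, unknown>).openAIOrgId = openAIOrgId;
  }
  return config;
}
```

Attaching the field only when it is set keeps the serialized request clean for users who have no organization ID configured.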
`src/aiParams.ts` (outdated diff hunk under review):

    @@ -73,6 +67,7 @@ export interface CustomModel {
      isBuiltIn?: boolean;
      enableCors?: boolean;
      core?: boolean;
    + azureOpenAIApiDeploymentName?: string; // Added for Azure OpenAI models
@logancyang (Owner) commented on this diff:

Isn't this the same as the one above?

@henryperkins (Author) replied:

I'm just trying to remove the last 2 commits I made - struggling.

@henryperkins (Author) added:

But to answer your question: currently, when choosing a custom model, you are not given the option to select a model by your deployment ID. I might be wrong, but my testing has led me to believe that several gpt-4o models can be available on a single Azure OpenAI deployment.
