Add PromptRoles config class for llmInterface #11

Conversation

mohamedryan
Pull Request Title:
Add PromptRoles config class to LLMInterface for consistent enum handling

Description:
This pull request adds a PromptRoles nested config class to LLMInterface to manage provider-specific enum values.

  • Changes Introduced:
    • Added a PromptRoles nested class to LLMInterface.
    • Each concrete provider class can override this PromptRoles config to define its respective enum values.
    • Updated the controller logic to access promptRoles via the interface instead of relying on specific implementation details.
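The shape described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the names LLMInterface and PromptRoles come from the PR description, while the role members, their string values, and the HypotheticalProvider subclass are assumptions made for the example.

```python
from enum import Enum


class LLMInterface:
    """Base interface for LLM providers (name from the PR; body illustrative)."""

    class PromptRoles(Enum):
        # Default role values; concrete providers may override this
        # nested class with their own provider-specific values.
        SYSTEM = "system"
        USER = "user"
        ASSISTANT = "assistant"

    def get_prompt_roles(self) -> type[Enum]:
        # Resolved against the concrete class, so overrides are honored.
        return type(self).PromptRoles


class HypotheticalProvider(LLMInterface):
    # A provider whose API expects different role strings simply
    # redefines the nested config class.
    class PromptRoles(Enum):
        SYSTEM = "system_instruction"
        USER = "human"
        ASSISTANT = "model"
```

Because the lookup goes through `type(self)`, callers holding an `LLMInterface` reference always receive the enum belonging to the concrete provider, without importing that provider's module.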

Reason for Changes:

  • Previously, each provider defined its own enums, which were accessed directly, leading to inconsistencies.
  • Adding the PromptRoles class to the interface standardizes the access to enums across all providers, ensuring a consistent structure and improving maintainability.
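A hedged sketch of the controller-side benefit, self-contained and therefore restating a minimal base interface; the `build_prompt` helper and all role values are hypothetical, not taken from NlpController:

```python
from enum import Enum


class LLMInterface:
    # Minimal restatement of the base interface for a runnable example;
    # role values are illustrative only.
    class PromptRoles(Enum):
        SYSTEM = "system"
        USER = "user"
        ASSISTANT = "assistant"


def build_prompt(provider: LLMInterface, text: str) -> dict:
    # The controller reads roles through the interface type instead of a
    # concrete provider class, so any provider-specific override of
    # PromptRoles is picked up automatically.
    roles = type(provider).PromptRoles
    return {"role": roles.USER.value, "content": text}


print(build_prompt(LLMInterface(), "hello"))
# {'role': 'user', 'content': 'hello'}
```

The controller stays provider-agnostic: swapping in a subclass with a different `PromptRoles` changes the emitted role strings without touching the controller code.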

Additional Notes:

  • This change enhances the code's readability and structure, providing a clear and predictable interface for interacting with different providers.

…provider, as there is a dependency on it in NlpController.
@mohamedryan changed the title from "Add PromptRoles config class for llmInterface, override it from each …" to "Add PromptRoles config class for llmInterface" on Oct 6, 2024