
Healthcheck fix when disabled, UI bug fixes #242

Merged
merged 7 commits into main from ui-fix
Feb 29, 2024

Conversation

gerboland
Contributor

@gerboland gerboland commented Feb 27, 2024

This is a bunch of bug fixes:

  1. If LLMs were disabled in the plugin config, the health check could still report an enabled LLM if a previous OpenAI API key had been provided. The default state in loadSettings() was the culprit.
  2. Fix for bug #241 (Config UI: changing LLM option & save, then switch tab away & back, old option still selected). The plugin relies on the jsonData provided to it by Grafana to know its current state (read from the backend on plugin initialisation). If the config page changes the configuration, the plugin's jsonData remains unchanged until a page reload happens. Fixed by pushing the new jsonData to the plugin on save.
  3. Removed some state duplication in LLMConfig (the setLLMOption state was unnecessary).
  4. Clear the health check results if the user changes any setting.
  5. Tightened up the health check (an empty API key is only permitted if LLMs are disabled or Grafana-managed).
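The fixes in items 1 and 5 can be sketched roughly as follows. This is a minimal illustration only, not the plugin's actual code: the type names, provider values, and the shape of loadSettings() here are all assumptions made for the sketch.

```typescript
// Hypothetical provider states for the sketch; the real plugin's
// settings shape may differ.
type LLMProvider = 'disabled' | 'grafana-managed' | 'openai';

interface PluginJsonData {
  provider: LLMProvider;
}

// Item 5: an empty API key only passes the health check when LLMs
// are disabled or Grafana-managed; a user-provided OpenAI setup
// must supply a key.
function isApiKeyValid(provider: LLMProvider, apiKey: string): boolean {
  if (provider === 'disabled' || provider === 'grafana-managed') {
    return true;
  }
  return apiKey.trim().length > 0;
}

// Item 1: default to 'disabled' when no provider is stored, so a
// leftover API key alone cannot make the health check report an
// enabled LLM.
function loadSettings(jsonData?: Partial<PluginJsonData>): PluginJsonData {
  return { provider: jsonData?.provider ?? 'disabled' };
}
```

The key point in both functions is that the presence of an API key is never treated as evidence that LLMs are enabled; only the stored provider setting decides that.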

@gerboland gerboland marked this pull request as draft February 27, 2024 13:14
@gerboland gerboland changed the title Reload when saving state that impacts jsonData Healthcheck fix when disabled, UI bug fixes Feb 28, 2024
@gerboland gerboland marked this pull request as ready for review February 28, 2024 12:45
Contributor

@edwardcqian edwardcqian left a comment

Tested cases 1 and 2.

Not sure how to test 4.

5 might cause issues with a self-hosted OpenAI-compatible service that doesn't require an API key, but maybe we want a separate way to handle those anyway.

@gerboland
Contributor Author

> Not sure how to test 4.

Select the own-OpenAI option and hit Save & Test; the health check result should appear. Then click the managed-LLM option. The health check result should vanish.

> 5 might cause issues with a self-hosted OpenAI-compatible service that doesn't require an API key, but maybe we want a separate way to handle those anyway.

Yep, that's a whole different problem.

@gerboland gerboland merged commit 286154b into main Feb 29, 2024
3 checks passed
@gerboland gerboland deleted the ui-fix branch February 29, 2024 14:30