
Update ollama.mdx #1323

Closed
wants to merge 1 commit into from

Conversation

@hfnhf commented Jan 13, 2025

update one of two: add quotes around the model name

BAML throws an error about base_url if the model line that follows base_url has no quotes around the model name

(screenshot of the base_url error)

update two of two:

(screenshot of the changed example)

change the example model to a nice, fully open model:
https://ollama.com/library/tulu3

end result:
no more error message about base_url (BAML 0.72.0 open-vsx extension is installed; screenshot below)

(screenshot showing no base_url error)
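For reference, a minimal sketch of what the corrected snippet in ollama.mdx looks like with both changes applied (the client name here is illustrative; the exact name and surrounding comments in the docs may differ):

client<llm> MyOllamaClient {
  provider "ollama"
  options {
    base_url "http://localhost:11434/v1"
    // quoting the model name avoids the base_url error described above
    model "tulu3"
  }
}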


Important

Fixes base_url error by quoting model name and updates example model to tulu3 in ollama.mdx.

  • Behavior:
    • Adds quotes around the model name tulu3 in ollama.mdx to prevent a base_url error.
    • Changes example model from llama3 to tulu3 in ollama.mdx for a fully open model.

This description was created by Ellipsis for 07e0fd3. It will automatically update as commits are pushed.

vercel bot commented Jan 13, 2025

@hfnhf is attempting to deploy a commit to the Gloo Team on Vercel.

A member of the Team first needs to authorize it.

@ellipsis-dev (bot) left a comment

👍 Looks good to me! Reviewed everything up to 07e0fd3 in 7 seconds

More details
  • Looked at 13 lines of code in 1 file
  • Skipped 0 files when reviewing.
  • Skipped posting 1 drafted comment based on config settings.
1. fern/03-reference/baml/clients/providers/ollama.mdx:22
  • Draft comment:
    The addition of quotes around the model name is correct and necessary to prevent syntax errors related to the base_url. This change ensures the model name is interpreted as a string.
  • Reason this comment was not posted:
    Confidence changes required: 0%
    The PR correctly adds quotes around the model name, which is necessary for proper syntax in this context. The change aligns with the requirement to avoid errors related to base_url.
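To illustrate the distinction the draft comment describes (as reported in this PR; the exact line and model in ollama.mdx may differ):

options {
  base_url "http://localhost:11434/v1"
  // model llama3   // unquoted: reported to surface an error pointing at base_url
  model "llama3"    // quoted: interpreted as a string, as intended
}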

Workflow ID: wflow_JcT5e70memrIXu54


You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.


hfnhf commented Jan 13, 2025

(screenshot of the playground error)
I'm still getting the base_url error in the playground, even though clients.baml no longer shows the base_url error. I need to close this while investigating what's actually going on.

@hfnhf closed this Jan 13, 2025

aaronvg commented Jan 13, 2025

hmm, this one works for me, for example:

client<llm> Llama3 {
  // See https://docs.boundaryml.com/docs/snippets/clients/providers/ollama
  // to learn more about how to configure this client
  //
  // Note that you should run ollama using `OLLAMA_ORIGINS='*' ollama serve`
  // and you'll also need to `ollama pull llama3` to use this client
  provider openai-generic
  options {
    base_url "http://localhost:11434/v1"
    model "llama3.2:1b"
  }
}


aaronvg commented Jan 13, 2025

your client in your function should say "CustomOllama"


hfnhf commented Jan 13, 2025

Thank you aaronvg sir, I was also able to run tests:

(screenshot of the test run)
with resume.baml containing
...

function ExtractResume(resume: string) -> Resume {
  // Specify a client as provider/model-name
  // you can use custom LLM params with a custom client name from clients.baml like "client CustomHaiku"
  client "CustomOllama"
  prompt #"
    
    {{ ctx.output_format }}
    Extract from this content:
    {{ resume }}

  "#
}

...

and clients.baml containing

client<llm> CustomOllama {
  provider "ollama"
  options {
    base_url "http://localhost:11434/v1"
    model "3thtulu3"
  }
}

I will wait to submit any other pull requests until I understand the full context of the rest of the documentation.
