Last chat delta received from on_llm_new_delta handler does not change to complete status #221

Open
seelowbei opened this issue Dec 26, 2024 · 1 comment
Labels: bug (Something isn't working)

Comments


seelowbei commented Dec 26, 2024

I'm using LLMChain from release v0.3.0-rc.1 with LiveView, where the chain is run in an async process. My code works well with the ChatOpenAI model. However, when I switch the model to ChatGoogleAI, the last chat delta received from the on_llm_new_delta handler stays in :incomplete status and never changes to :complete. Can anyone help with this issue? Thanks.

  def handle_params(_params, _uri, socket) do
    socket =
      socket
      |> assign(
        # assigned under :llm_chain so run_chain/1 and handle_info/2 below can find it
        :llm_chain,
        LLMChain.new!(%{
          llm:
            ChatGoogleAI.new!(%{
              model: "gemini-2.0-flash-exp",
              temperature: 0,
              request_timeout: 60_000,
              stream: true
            }),
          verbose: false
        })
      )
      |> assign(:async_result, %AsyncResult{})

    {:noreply, socket}
  end

  defp run_chain(socket) do
    live_view_pid = self()

    # Chain-level callback: forward tool responses to the LiveView process.
    chain_handler = %{
      on_tool_response_created: fn _chain, %Message{} = tool_message ->
        send(live_view_pid, {:tool_executed, tool_message})
      end
    }

    # Model-level callback: forward each streamed delta to the LiveView process.
    model_handler = %{
      on_llm_new_delta: fn _model, delta ->
        send(live_view_pid, {:chat_delta, delta})
      end
    }

    chain =
      socket.assigns.llm_chain
      |> LLMChain.add_callback(chain_handler)
      |> LLMChain.add_llm_callback(model_handler)

    socket
    |> assign(:async_result, AsyncResult.loading())
    |> start_async(:running_llm, fn ->
      case LLMChain.run(chain, mode: :while_needs_response) do
        {:ok, _updated_chain} -> :ok
        {:error, _updated_chain, %LangChain.LangChainError{message: reason}} -> {:error, reason}
      end
    end)
  end

  def handle_info({:chat_delta, %LangChain.MessageDelta{} = delta}, socket) do
    updated_chain = LLMChain.apply_delta(socket.assigns.llm_chain, delta)

    {:noreply, assign(socket, :llm_chain, updated_chain)}
  end
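
For reference, the completion signal in question is the MessageDelta's status field: with ChatOpenAI the final streamed delta arrives with status: :complete, while the report here is that ChatGoogleAI leaves it at :incomplete. Below is a minimal sketch (not part of the original report) of a handle_info/2 variant that inspects that field as deltas arrive; the :streaming_done? assign is a hypothetical name used only for illustration.

  # Sketch only: a variant of the handle_info/2 clause above that logs each
  # delta's status and sets a (hypothetical) :streaming_done? assign once a
  # delta with status: :complete arrives. With the reported ChatGoogleAI
  # behavior, that never happens because the last delta stays :incomplete.
  def handle_info({:chat_delta, %LangChain.MessageDelta{status: status} = delta}, socket) do
    IO.inspect(status, label: "chat delta status")

    updated_chain = LLMChain.apply_delta(socket.assigns.llm_chain, delta)

    socket =
      socket
      |> assign(:llm_chain, updated_chain)
      |> assign(:streaming_done?, status == :complete)

    {:noreply, socket}
  end

If the provider behaves as expected, :streaming_done? becomes true on the final chunk; with the behavior described above it stays false, which is one quick way to confirm the problem is in the delta's status rather than in how the chain merges deltas.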
brainlid (Owner) commented Jan 4, 2025

This sounds like a bug. I don't regularly use ChatGoogleAI. Is anyone who uses it able to jump in?

brainlid added the bug label on Jan 4, 2025