chain not being updated on llm_token_usage callback? #267
Ah, interesting catch. I’ll have to look at it more, but I suspect when we
get the usage metadata, the message hasn’t been fully processed yet.
On Mon, Mar 17, 2025, Andre Parmeggiani wrote:
Not sure if this is expected behaviour, but `last_message.content` has different values in the example below.
Shouldn't they be the same, since the usage values refer to both messages?
Thanks!
```elixir
events = %{
  on_llm_token_usage: fn chain, _usage ->
    IO.inspect(chain.last_message.content, label: "FROM_CALLBACK")
  end
}

{:ok, updated_chain} =
  LLMChain.new!(%{llm: ChatOpenAI.new!()})
  |> LLMChain.add_message(Message.new_user!("What's your name?"))
  |> LLMChain.add_callback(events)
  |> LLMChain.run(mode: :while_needs_response)

IO.inspect(updated_chain.last_message.content, label: "FROM_RUN")
```

```
FROM_CALLBACK: "What's your name?"
FROM_RUN: "I am an AI assistant and my name is Assistant. How can I help you today?"
```
aaparmeggiani created the original issue (brainlid/langchain#267).
Thanks for your quick reply, Mark. Yes, that looks fine if you only need the usage metadata, but it becomes tricky if you also need access to the output itself, for instance when feeding a trace.
I looked deeper into it. There isn't a good way to combine the two. The token usage data and the message are processed by the ChatOpenAI module, while the chain is managed by the LLMChain, and the chain doesn't get updated until after the ChatOpenAI module is done. So the last_message on the chain is not yet updated at that point. The purpose of making the chain available to the callback was to give the callback access to the chain's state at the time it fires. If you need to tie the data in the token usage callback together with something from the request, you can perhaps put something in the custom_context that you access in the callback.
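A minimal sketch of that custom_context suggestion, following the API style used in the example above; the :request_id key is a made-up illustration, not part of the library:

```elixir
# Sketch only: assumes the LangChain Elixir library used in this thread.
# :request_id is a hypothetical key placed in custom_context for correlation.
events = %{
  on_llm_token_usage: fn chain, usage ->
    # chain.last_message is not yet updated when this fires, but
    # custom_context is set at chain creation, so it can carry whatever
    # request data you need to tie the usage back to the request.
    IO.inspect({chain.custom_context[:request_id], usage}, label: "USAGE")
  end
}

{:ok, updated_chain} =
  LLMChain.new!(%{llm: ChatOpenAI.new!(), custom_context: %{request_id: "req-123"}})
  |> LLMChain.add_message(Message.new_user!("What's your name?"))
  |> LLMChain.add_callback(events)
  |> LLMChain.run(mode: :while_needs_response)
```

This correlates usage with the request, though it still doesn't give the callback the final message content.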
Cheers, Mark!
Reopening for further thoughts. The solution I've found so far is storing the usage directly in the message, as metadata, which eliminates the need for a callback altogether in my use case: aaparmeggiani@3d750a7
FYI, messages will include the usage information in Message.metadata.usage. It's being added to MessageDeltas as well, and the deltas can be accumulated together. This should provide a much clearer and more direct link between token usage and each request.
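With that change, reading content and usage together becomes straightforward. A hedged sketch, assuming a library version where usage lands in Message.metadata.usage as described above (the exact struct shape may differ by version):

```elixir
# Sketch: assumes a LangChain version that stores usage on the message
# itself, per the comment above (Message.metadata.usage).
{:ok, updated_chain} =
  LLMChain.new!(%{llm: ChatOpenAI.new!()})
  |> LLMChain.add_message(Message.new_user!("What's your name?"))
  |> LLMChain.run(mode: :while_needs_response)

# Content and its token usage come from the same message, so no
# callback is needed to correlate them.
IO.inspect(updated_chain.last_message.content, label: "CONTENT")
IO.inspect(updated_chain.last_message.metadata.usage, label: "USAGE")
```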