
Token cost change when used with cached inference #5846

Answered by ekzhu
irf-rox asked this question in Q&A
We stopped maintaining cost tracking, so the reported costs are likely inaccurate and unreliable now. Please use the latest version, which focuses on tracking token usage in the model clients instead.
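Since the library now reports token usage rather than cost, a minimal sketch of estimating cost yourself from those counts, with a discount applied to cached prompt tokens. The prices and the 50% cached-token discount below are hypothetical placeholders for illustration, not values taken from the library or any provider's price list:

```python
def estimate_cost(prompt_tokens: int,
                  completion_tokens: int,
                  cached_tokens: int = 0,
                  input_price_per_1k: float = 0.0025,   # hypothetical price
                  output_price_per_1k: float = 0.01,    # hypothetical price
                  cached_discount: float = 0.5) -> float:
    """Estimate request cost from token counts reported by a model client.

    `cached_tokens` is the portion of `prompt_tokens` served from the
    provider's prompt cache; those tokens are billed at a discounted rate.
    """
    uncached_tokens = prompt_tokens - cached_tokens
    input_cost = (uncached_tokens * input_price_per_1k
                  + cached_tokens * input_price_per_1k * cached_discount) / 1000
    output_cost = completion_tokens * output_price_per_1k / 1000
    return input_cost + output_cost
```

Plug in the prompt/completion/cached token counts your model client reports, together with your provider's actual prices, to recover a cost estimate.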

Replies: 1 comment 1 reply

Answer selected by irf-rox
Category: Q&A
Labels: 0.2 (issues related to the pre-0.4 codebase)
2 participants