Include token estimation and corresponding ecological implications for all requests.
We use 8 different libraries (gliner, ollama, outlines, transformers, instructor, langchain, vllm, dspy) to convert requests into structured data. We need a way to count input and output tokens for all of these, taking the specific model used into account, and to store the counts in a unified format. A corresponding call looks, for example, like the sketch below.
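(A minimal, hypothetical example via the instructor route with an OpenAI client; the schema, model name, and prompt are purely illustrative, and every library in the list above has its own flavour of this.)

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class Person(BaseModel):
    name: str
    birth_year: int


# Patch the OpenAI client so responses are validated against the schema.
client = instructor.from_openai(OpenAI())

person = client.chat.completions.create(
    model="gpt-4o-mini",
    response_model=Person,
    messages=[{"role": "user", "content": "Ada Lovelace was born in 1815."}],
)
print(person)  # Person(name='Ada Lovelace', birth_year=1815)
```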
I haven't looked into this too much yet. The annoying part is finding an easy and not too verbose way to make model-independent token estimations. Once we have token count estimates, we can think about which other stats we want to compute.
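One rough direction (a sketch only; the `TokenUsage` record and helper names below are made up): use the model's own tokenizer where we can get one, e.g. via transformers for Hub models, and fall back to a crude characters-per-token heuristic where we can't.

```python
from dataclasses import dataclass

from transformers import AutoTokenizer


@dataclass
class TokenUsage:
    """Unified per-request token record, independent of the backend library."""
    model: str
    n_input_tokens: int
    n_output_tokens: int


def estimate_tokens(text: str, model: str) -> int:
    """Count tokens with the model's tokenizer if available, otherwise fall
    back to a rough ~4 characters per token heuristic for English text."""
    try:
        tokenizer = AutoTokenizer.from_pretrained(model)
        return len(tokenizer.encode(text))
    except Exception:
        return max(1, len(text) // 4)


def track_usage(model: str, prompt: str, completion: str) -> TokenUsage:
    return TokenUsage(
        model=model,
        n_input_tokens=estimate_tokens(prompt, model),
        n_output_tokens=estimate_tokens(completion, model),
    )


usage = track_usage(
    "Qwen/Qwen2.5-0.5B-Instruct",
    prompt="Ada Lovelace was born in 1815.",
    completion='{"name": "Ada Lovelace", "birth_year": 1815}',
)
print(usage)
```

For API-only models a tiktoken code path could slot into the same helper, and where a backend already reports usage (e.g. the `usage` field in OpenAI responses, or `prompt_eval_count`/`eval_count` in ollama responses) we should probably prefer that over re-tokenizing.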
Hoping that we can integrate one of the existing libraries (EcoLogits or whatever else, although the way that EcoLogits places itself in the call stack won't work for us) to do the heavy lifting for us.
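For the ecological part, if we end up tracking token counts ourselves, maybe we can call EcoLogits' impact computation directly instead of letting it wrap the provider clients. Rough sketch below; the import path and signature are my best guess at their tracer utilities and need checking against the EcoLogits docs.

```python
# Assumed entry point; verify against the EcoLogits documentation.
from ecologits.tracers.utils import llm_impacts

impacts = llm_impacts(
    provider="openai",          # provider key as EcoLogits expects it
    model_name="gpt-4o-mini",   # illustrative model name
    output_token_count=512,     # would come from our unified token record
    request_latency=1.2,        # seconds, measured on our side
)
print(impacts)  # energy / GWP / other impact estimates
```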