When using embeddings, the `total_tokens` count of a callback is wrong. For example, the following snippet currently prints `0` even though the embedding call did consume tokens:

```python
from langchain.callbacks import get_openai_callback
from langchain.embeddings import OpenAIEmbeddings

with get_openai_callback() as cb:
    embeddings = OpenAIEmbeddings()
    embeddings.embed_query("hello")

print(cb.total_tokens)
```

IMO this is confusing, and there is currently no way to get the cost from the embeddings class either.
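
As a stopgap until the callback tracks embedding usage, the token count can be estimated directly with `tiktoken`. This is only a sketch, assuming the `text-embedding-ada-002` model; the cost would then be the token count times whatever per-token price OpenAI currently charges:

```python
# Sketch of a workaround: count embedding tokens manually with tiktoken,
# since get_openai_callback() does not record them.
import tiktoken

def count_embedding_tokens(texts, model="text-embedding-ada-002"):
    """Return the total number of tokens the embedding request will consume."""
    encoding = tiktoken.encoding_for_model(model)
    return sum(len(encoding.encode(text)) for text in texts)

# Example: count tokens for the same query as above.
print(count_embedding_tokens(["hello"]))
```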