What happened?
S3 config
1. custom_pricing not working for Groq whisper-large-v3 (STT model)
reasons:
SPECIAL_MODEL_INFO_PARAMS does not contain input_cost_per_second, and we also need to add output_cost_per_second for TTS models (see the sketch below):
litellm/litellm/litellm_core_utils/litellm_logging.py, lines 2560 to 2575 in 55139b8
litellm/litellm/types/router.py, lines 373 to 378 in 55139b8
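In other words, the proposal is to extend that allow-list with the per-second pricing keys. A minimal sketch, assuming SPECIAL_MODEL_INFO_PARAMS is a flat list of per-model pricing keys; the pre-existing entries shown are assumptions, and only the two *_cost_per_second keys are the proposed addition:

```python
# litellm/litellm_core_utils/litellm_logging.py (sketch; existing entries are assumed)
SPECIAL_MODEL_INFO_PARAMS = [
    "input_cost_per_token",
    "output_cost_per_token",
    # proposed additions so per-second custom pricing reaches cost tracking
    # for STT models; output_cost_per_second also covers the TTS case:
    "input_cost_per_second",
    "output_cost_per_second",
]
```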
2. model_prices_and_context_window_backup.json does not contain some Groq STT models: whisper-large-v3, whisper-large-v3-turbo, distil-whisper-large-v3-en (a runtime workaround sketch is below).
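As a stopgap until those entries land in the cost map, the missing models can be registered at runtime. A minimal sketch, assuming the same field names as the existing whisper-1 entry; the prices are placeholders, so check Groq's pricing page before relying on them:

```python
import litellm

# Placeholder pricing; field names mirror existing audio_transcription entries.
groq_stt_entry = {
    "mode": "audio_transcription",
    "litellm_provider": "groq",
    "input_cost_per_second": 0.0,
    "output_cost_per_second": 0.00003,  # placeholder price
}

# Register the Groq STT models missing from model_prices_and_context_window_backup.json.
litellm.register_model({
    "groq/whisper-large-v3": dict(groq_stt_entry),
    "groq/whisper-large-v3-turbo": dict(groq_stt_entry),
    "groq/distil-whisper-large-v3-en": dict(groq_stt_entry),
})
```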
3. when I fix issues 1 and 2 above, a response_cost exception appears (repro sketch below)
curl:
some error log for response_cost:
Exception: This model isn't mapped yet. model=whisper-1, custom_llm_provider=groq. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json.
reasons:
I use groq whisper-large-v3, but the response_cost log still shows model=whisper-1, custom_llm_provider=groq, because Groq uses OpenAIAudioTranscription, which hard-codes the model name and provider:
litellm/litellm/llms/openai/transcriptions/handler.py, line 145 in 55139b8
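To make the mismatch concrete, here is a minimal repro sketch. The audio file name is a placeholder, GROQ_API_KEY is assumed to be set, and printing _hidden_params is only meant to show where response_cost would be reported:

```python
import litellm

# Placeholder audio file; GROQ_API_KEY is assumed to be set in the environment.
with open("sample.wav", "rb") as audio_file:
    response = litellm.transcription(
        model="groq/whisper-large-v3",
        file=audio_file,
    )

# The request is made with whisper-large-v3, but cost tracking reportedly looks up
# model=whisper-1 with custom_llm_provider=groq, which has no cost-map entry,
# hence the "This model isn't mapped yet" exception in the response_cost log.
print(response._hidden_params)
```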
4. Groq STT is missing some unit tests (a test sketch is below)
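A sketch of the kind of test the PR could add; the test name and file location are hypothetical, and it assumes litellm.get_model_info resolves the new entries once issues 1 and 2 are fixed:

```python
# Hypothetical test, e.g. something like tests/local_testing/test_groq_stt_cost.py
import litellm


def test_groq_stt_models_have_cost_map_entries():
    # After adding the Groq STT entries, they should resolve for cost tracking.
    for model in (
        "groq/whisper-large-v3",
        "groq/whisper-large-v3-turbo",
        "groq/distil-whisper-large-v3-en",
    ):
        info = litellm.get_model_info(model)
        assert info["litellm_provider"] == "groq"
        assert info["mode"] == "audio_transcription"
```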
May I create a PR to fix this bug?
Relevant log output
Are you a ML Ops Team?
Yes
What LiteLLM version are you on ?
v1.56.9
Twitter / LinkedIn details
@Hugo_Liu_X