Need for aiohttp-based client support across different providers supported by litellm #7530
Unanswered · rachitchauhan43 asked this question in Q&A · Replies: 1 comment, 1 reply
-
Here is the plan @rachitchauhan43: #7544
-
In the context of the recently uncovered performance issues with httpx, the litellm team released a fix that uses an aiohttp-based client for OpenAI models. Great job on that!
The same aiohttp-based client support is needed across all providers in litellm that currently use an httpx-based client, so that all of those provider integrations can be equally performant.
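The migration being requested could be sketched as a swappable transport layer behind a common async interface, chosen per provider. A minimal sketch follows; all names here (`AsyncHTTPHandler`, `TRANSPORTS`, `get_handler`, the provider entries) are illustrative assumptions, not litellm's actual internals, and the third-party imports are deferred so a provider only pays for the backend it actually uses.

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Any, Dict


class AsyncHTTPHandler(ABC):
    """Hypothetical transport interface; names are illustrative,
    not litellm's real internal API."""

    @abstractmethod
    async def post(self, url: str, json: Dict[str, Any]) -> Dict[str, Any]:
        """Send a POST request and return the decoded JSON body."""


class AiohttpHandler(AsyncHTTPHandler):
    """aiohttp-backed transport (the faster option per the discussion)."""

    async def post(self, url: str, json: Dict[str, Any]) -> Dict[str, Any]:
        import aiohttp  # deferred: third-party dependency, assumed installed

        async with aiohttp.ClientSession() as session:
            async with session.post(url, json=json) as resp:
                resp.raise_for_status()
                return await resp.json()


class HttpxHandler(AsyncHTTPHandler):
    """httpx-backed transport, matching what most providers use today."""

    async def post(self, url: str, json: Dict[str, Any]) -> Dict[str, Any]:
        import httpx  # deferred: third-party dependency, assumed installed

        async with httpx.AsyncClient() as client:
            resp = await client.post(url, json=json)
            resp.raise_for_status()
            return resp.json()


# A per-provider registry lets each integration switch transports without
# touching call sites; entries here are illustrative only.
TRANSPORTS: Dict[str, type] = {
    "openai": AiohttpHandler,   # already migrated, per the discussion
    "anthropic": HttpxHandler,  # still on httpx today (example entry)
}


def get_handler(provider: str) -> AsyncHTTPHandler:
    """Resolve the configured transport for a provider, defaulting to httpx."""
    return TRANSPORTS.get(provider, HttpxHandler)()
```

With a registry like this, rolling out aiohttp to every provider becomes a one-line change per integration (or a flipped default), rather than rewriting each provider's request path.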
Open questions: