Any plan to support ChatGPT? #28
Comments
ChatGPT is a chat app built around the LLM API provided by OpenAI. @domesticmouse and I are working with @davidmigloz on support for OpenAI LLMs right now (among others): #30
I wonder how that is going to work without leaking your OpenAI keys in the app. On the web there is Next.js, which separates the server from the app, but I don't think anyone has done that for Flutter yet...
You can certainly build your own server-side proxy and plug that into the AI Toolkit.
Maybe you could include that somewhere in the AI Toolkit, because it feels like a very common request (without it, people can't use the AI Toolkit in production).
You certainly can. Vertex has support for that.
But Vertex doesn't support OpenAI.
You can build a simple proxy as @csells was mentioning, e.g. using Firebase Cloud Functions.
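To make the proxy idea concrete, here is a minimal sketch in plain Node/TypeScript rather than a full Cloud Function. It forwards chat requests to OpenAI and attaches the secret key server-side, so the key never ships inside the Flutter app. The `/v1/chat` path, the port, and the `OPENAI_API_KEY` environment variable are illustrative choices, not anything prescribed by the AI Toolkit.

```typescript
import * as http from "http";

// Build the headers for the upstream OpenAI call. The client never sends
// (or sees) the real API key; it is injected here on the server.
function buildUpstreamHeaders(apiKey: string): Record<string, string> {
  return {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${apiKey}`,
  };
}

const server = http.createServer(async (req, res) => {
  // Only accept chat requests on the illustrative /v1/chat endpoint.
  if (req.method !== "POST" || req.url !== "/v1/chat") {
    res.writeHead(404).end();
    return;
  }

  // Collect the request body sent by the client app.
  let body = "";
  for await (const chunk of req) body += chunk;

  // Forward to OpenAI, adding the secret key server-side (Node 18+ has
  // a global fetch).
  const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: buildUpstreamHeaders(process.env.OPENAI_API_KEY ?? ""),
    body,
  });

  // Relay the upstream status and body back to the client.
  res.writeHead(upstream.status, { "Content-Type": "application/json" });
  res.end(await upstream.text());
});

// Guarded so importing this module (e.g. in a test) does not bind a port.
if (process.env.RUN_PROXY === "1") {
  server.listen(3000);
}
```

The Flutter app would then point its provider at this host instead of `api.openai.com`; the same shape ports straightforwardly to a Firebase Cloud Functions `onRequest` handler.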
I've created a minimal open-webui API client as an LlmProvider implementation:

    LlmChatView(
      provider: OpenwebuiProvider(
        host: 'http://127.0.0.1:3000', // Open-webui host
        model: 'llama3.1:latest', // The chat model you want to use, for example gpt-4o-mini
        apiKey: 'YOUR_API_KEY', // Open-webui API key, not the API key of an external LLM provider
        history: [], // Previous conversation
      ),
    )

The fork can be found here.
Hey, @DominikStarke. If you want to contribute your open-webui provider, you can do so into the