
Any plan to support ChatGPT? #28

Closed
LaiZhou opened this issue Dec 19, 2024 · 9 comments

Comments


LaiZhou commented Dec 19, 2024

Any plan to support ChatGPT?

csells (Contributor) commented Dec 19, 2024

ChatGPT is a chat app built around the LLM API provided by OpenAI. @domesticmouse and I are working with @davidmigloz on support for OpenAI LLMs right now (among others): #30

@bernaferrari

I wonder how that is going to work without leaking your OpenAI keys in the app. On the web there's Next.js, which separates the server from the app, but I don't think anyone has done that in Flutter yet...

csells (Contributor) commented Dec 20, 2024

You can certainly build your own server-side proxy and plug that into the AI Toolkit.
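A proxy of the kind described above can be sketched in plain Dart with dart:io. This is only a minimal illustration, not part of the AI Toolkit: the endpoint path, port, and OPENAI_API_KEY environment variable are all assumptions, and a production proxy would also need authentication, rate limiting, and CORS handling.

```dart
import 'dart:io';

Future<void> main() async {
  // The secret stays on the server; the Flutter app never sees it.
  // (Hypothetical env var name for this sketch.)
  final apiKey = Platform.environment['OPENAI_API_KEY'];
  final client = HttpClient();

  final server = await HttpServer.bind(InternetAddress.anyIPv4, 8080);
  await for (final request in server) {
    if (request.method == 'POST' &&
        request.uri.path == '/v1/chat/completions') {
      // Forward the request body to OpenAI, attaching the key server-side.
      final upstream = await client.postUrl(
          Uri.parse('https://api.openai.com/v1/chat/completions'));
      upstream.headers.set('Authorization', 'Bearer $apiKey');
      upstream.headers.contentType = ContentType.json;
      await upstream.addStream(request);
      final response = await upstream.close();

      // Relay the upstream status and body back to the app.
      request.response.statusCode = response.statusCode;
      await response.pipe(request.response);
    } else {
      request.response.statusCode = HttpStatus.notFound;
      await request.response.close();
    }
  }
}
```

The Flutter app would then point its OpenAI-compatible provider at this proxy's base URL instead of api.openai.com.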

@bernaferrari

Maybe you could include that somewhere in the AI Toolkit, because it feels like a very common request; without it, people can't use the AI Toolkit in production.

csells (Contributor) commented Dec 20, 2024

You certainly can. Vertex AI has support for that.

@bernaferrari

But Vertex AI doesn't support OpenAI models.

davidmigloz (Contributor) commented:

You can build a simple proxy, as @csells mentioned, e.g. using Firebase Cloud Functions.

@DominikStarke

I've created a minimal Open WebUI API client as an LlmProvider implementation.
I'm currently using it with Ollama (a local model), but since Open WebUI also supports ChatGPT, it should be possible to use that as well.

LlmChatView(
  provider: OpenwebuiProvider(
    host: 'http://127.0.0.1:3000', // Open WebUI host
    model: 'llama3.1:latest',      // Chat model to use, e.g. gpt-4o-mini
    apiKey: 'YOUR_API_KEY',        // Open WebUI API key, not the external LLM provider's key
    history: [],                   // Previous conversation
  ),
)

The fork can be found here

csells (Contributor) commented Dec 21, 2024

Hey, @DominikStarke. If you want to contribute your open-webui provider, you can do so in the flutter_ai_community repo that @davidmigloz and I are pulling together. I created an issue on that repo.

csells closed this as completed Dec 21, 2024
5 participants