
Cannot use ollama/llama3.3 #875

Open
yazeedalrubyli opened this issue Jan 9, 2025 · 1 comment
Labels
enhancement New feature or request

Comments

@yazeedalrubyli

yazeedalrubyli commented Jan 9, 2025

**Is your feature request related to a problem?**
I can use Ollama models up to llama3.2. What about the new ones?

Describe the solution you'd like
Adding support for the models llama3.3 and phi4.

@yazeedalrubyli yazeedalrubyli changed the title Can not use ollama/llama3.3 Cannot use ollama/llama3.3 Jan 9, 2025
@dosubot dosubot bot added the enhancement New feature or request label Jan 9, 2025
@PeriniM
Collaborator

PeriniM commented Jan 10, 2025

Hey @yazeedalrubyli, are you getting any errors? Have you tried explicitly setting the model_tokens in the graph config?
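For reference, a minimal sketch of what that could look like. The `base_url` and the token count are assumptions (adjust to your local Ollama endpoint and the model's actual context window); the `graph_config` shape follows the usual ScrapeGraphAI pattern:

```python
# Sketch of a graph config that sets model_tokens explicitly, as suggested above.
# Assumptions: Ollama running on the default local port, and a 128k context
# window for llama3.3 -- verify both against your setup.
graph_config = {
    "llm": {
        "model": "ollama/llama3.3",
        "temperature": 0,
        "base_url": "http://localhost:11434",  # default local Ollama endpoint (assumed)
        "model_tokens": 128000,  # explicit context size, so the library doesn't need a built-in default
    },
    "verbose": True,
}
```

You would then pass this config to a graph class such as `SmartScraperGraph` as usual; setting `model_tokens` explicitly sidesteps the library not yet knowing the context size of newer models.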
