Support MCP tools with models other than Anthropic #3672

Open
2 tasks done
prd-tuong-nguyen opened this issue Jan 10, 2025 · 5 comments
Assignees
Labels
kind:enhancement Indicates a new feature request, improvement, or extension "needs-triage"

Comments

@prd-tuong-nguyen

prd-tuong-nguyen commented Jan 10, 2025

Validations

  • I believe this is a way to improve. I'll try to join the Continue Discord for questions
  • I'm not able to find an open issue that requests the same enhancement

Problem

Currently, MCP tools can only be used with Anthropic models.

Solution

Allow all LLMs to use MCP tools.

@dosubot dosubot bot added the kind:enhancement Indicates a new feature request, improvement, or extension label Jan 10, 2025
@xiaoshyang

I am calling Claude deployed on AWS through an OpenAI-API-compatible interface. Please add support for this as well. Thank you.

@VikashLoomba

+1, there is no real reason to limit tools to the Anthropic provider and Claude only. Plenty of open-source models support tool use.

@shermanhuman
Contributor

Question: Is this something we can accomplish in a generic way? Can we write one implementation and then add a configuration parameter to enable it per model, or maintain a database of models/providers that have the capability?

Or is this something we need to implement on a provider-by-provider basis? If that's the case, instead of asking for "all LLM" we might want to start compiling a list of precisely what is wanted. @prd-tuong-nguyen @xiaoshyang @VikashLoomba could you elaborate a bit on what you are trying to do with tools, and which providers, APIs, and models you would like tool support for?

@prd-tuong-nguyen
Author

prd-tuong-nguyen commented Jan 14, 2025

@shermanhuman I think we should prioritize support for the OpenAI-compatible format, as nearly all LLM serving frameworks and providers support it. In other words, if we support the OpenAI-compatible format, we support nearly "all LLM".
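For context, the OpenAI-compatible tool-calling request that most serving frameworks mirror has this shape. A minimal sketch follows; the `read_file` tool name and the model name are hypothetical placeholders, and the payload is just assembled, not sent:

```python
# Hedged sketch: the wire format of an OpenAI-compatible chat-completions
# request carrying tool definitions. Tool and model names are illustrative.
import json

def build_tool_request(model: str, messages: list, tools: list) -> dict:
    """Assemble a chat-completions payload with a `tools` array."""
    return {"model": model, "messages": messages, "tools": tools}

tools = [{
    "type": "function",
    "function": {
        "name": "read_file",  # hypothetical MCP-backed tool
        "description": "Read a file from the workspace",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}]

payload = build_tool_request(
    "gpt-4o",  # any OpenAI-compatible model name would slot in here
    [{"role": "user", "content": "Open README.md"}],
    tools,
)
print(json.dumps(payload, indent=2))
```

Because so many providers (and local servers like vLLM or Ollama) accept this exact schema, targeting it once would cover most of the non-Anthropic models being requested in this thread.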

@VikashLoomba

I agree with what @prd-tuong-nguyen said. If we can enable OpenAI based tool calling (perhaps by a per-model setting "tools_enabled" in config.json) first, that would be most useful.
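The per-model flag proposed above might look like this in `config.json`; note that `tools_enabled` is the commenter's suggested key, not an existing Continue option, and the model entries are illustrative:

```json
{
  "models": [
    {
      "title": "GPT-4o (tools)",
      "provider": "openai",
      "model": "gpt-4o",
      "tools_enabled": true
    },
    {
      "title": "Local Llama",
      "provider": "ollama",
      "model": "llama3.1",
      "tools_enabled": false
    }
  ]
}
```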
