
Feature request: run locally with ollama #723

Open
miguelfs opened this issue Jan 24, 2025 · 2 comments
Labels: enhancement (New feature or request)

Comments


miguelfs commented Jan 24, 2025

Hi, I wonder if it would be possible to add compatibility with Ollama-downloaded models (e.g. DeepSeek-R1) here. It would be useful when the internet is unavailable.

@jellydn added the enhancement (New feature or request) label on Jan 25, 2025

pidgeon777 commented Jan 26, 2025

I think CopilotChat.nvim, which wraps requests to the GitHub Copilot server, could work very well with different models. In the end, it's just a matter of making curl requests and reading the output. The "problem" is that, by design choice, CopilotChat.nvim will probably only support the GitHub Copilot models.

If one day the plugin can also be used with other providers (not just Copilot), I suppose many people would love that.
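
For reference, Ollama exposes a plain HTTP API on localhost (port 11434 by default), so talking to a locally pulled model really is just a request/response exchange. Here is a minimal sketch in Python using only the standard library; it assumes Ollama is running with default settings and that a model such as deepseek-r1 has already been pulled, and it is unrelated to CopilotChat.nvim's internals:

```python
# Minimal sketch (not CopilotChat.nvim code): chat with a locally pulled
# Ollama model over its default HTTP API. Assumes `ollama serve` is running
# on localhost:11434 and the model was pulled, e.g. `ollama pull deepseek-r1`.
import json
import urllib.request

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"


def ask_ollama(prompt: str, model: str = "deepseek-r1") -> str:
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_CHAT_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Non-streaming responses carry the reply under message.content.
    return body["message"]["content"]


if __name__ == "__main__":
    print(ask_ollama("Explain what a Neovim autocommand is."))
```
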

deathbeam (Collaborator) commented

With #494 merged, I started thinking about the provider structure so that both Copilot Chat and Marketplace models are handled in a nicer way than in a single class. I have not done it yet, but once I figure out a nice interface, it will also allow specifying custom providers in the future.
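
To illustrate the idea only (a hypothetical sketch, not the plugin's actual code or planned API), such a provider abstraction might look roughly like this, with Copilot, the Marketplace models, and a local Ollama instance each being one implementation:

```python
# Purely hypothetical sketch; names are illustrative and not taken from
# CopilotChat.nvim. It only shows the shape a pluggable provider interface
# could take.
from abc import ABC, abstractmethod


class ChatProvider(ABC):
    """One chat backend (Copilot, Marketplace, Ollama, ...) behind a common API."""

    @abstractmethod
    def list_models(self) -> list[str]:
        """Return the model identifiers this provider can serve."""

    @abstractmethod
    def complete(self, model: str, messages: list[dict]) -> str:
        """Send a chat transcript and return the assistant's reply."""


class OllamaProvider(ChatProvider):
    """Example implementation backed by a local Ollama server."""

    def __init__(self, base_url: str = "http://localhost:11434"):
        self.base_url = base_url

    def list_models(self) -> list[str]:
        raise NotImplementedError  # would call GET {base_url}/api/tags

    def complete(self, model: str, messages: list[dict]) -> str:
        raise NotImplementedError  # would call POST {base_url}/api/chat
```

With an interface like this, adding Ollama support would amount to writing one more implementation rather than touching the core chat logic.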
