Hi, I wonder if it would be possible to add compatibility to ollama downloaded models (e.g. deepseek r1) in here. Would be useful when internet is unavailable.
I think CopilotChat.nvim, which wraps requests to the GitHub Copilot server, could work well with different models. In the end, it's just a matter of making cURL requests and parsing the output. The "problem" is that, by design choice, CopilotChat.nvim will probably only support the GitHub Copilot models.
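To illustrate the "just cURL requests" point: Ollama serves a local HTTP API on port 11434 by default, so talking to a locally pulled model is one POST request. A minimal sketch (assuming Ollama is running and a `deepseek-r1` model has been pulled; the prompt here is just an example):

```shell
# Chat with a local Ollama model via its native /api/chat endpoint.
# "stream": false returns a single JSON object instead of a token stream.
curl -s http://localhost:11434/api/chat -d '{
  "model": "deepseek-r1",
  "messages": [
    {"role": "user", "content": "Explain this Lua snippet in one sentence."}
  ],
  "stream": false
}'
```

Ollama also exposes an OpenAI-compatible endpoint at `/v1/chat/completions`, which is the route most provider-agnostic plugins use, since the same request shape then works for several backends.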
If one day we're able to use the plugin with other providers as well (not just Copilot), I suppose many would love that.
With #494 merged, I started thinking about the provider structure, so that both Copilot chat and marketplace models are handled in a nicer way than in one class. I haven't done it yet, but once I figure out a nice interface, it will also allow specifying custom providers in the future.