Adding Ollama details to Local LLM doc #14

Open

jmatthiesen wants to merge 1 commit into main

Conversation

jmatthiesen

Purpose

Does this introduce a breaking change?

[ ] Yes
[x] No

Pull Request Type

What kind of change does this Pull Request introduce?

[ ] Bugfix
[ ] Feature
[ ] Code style update (formatting, local variables)
[ ] Refactoring (no functional changes, no api changes)
[x] Documentation content changes
[ ] Other... Please describe:

@@ -4,16 +4,32 @@ You may want to save costs by developing against a local LLM server, such as
 [llamafile](https://github.com/Mozilla-Ocho/llamafile/). Note that a local LLM
 will generally be slower and not as sophisticated.
 
-Once you've got your local LLM running and serving an OpenAI-compatible endpoint, define `LOCAL_OPENAI_ENDPOINT` in your `.env` file.
+Once you've got your local LLM running and serving an OpenAI-compatible endpoint, define `LOCAL_MODELS_ENDPOINT` in your `.env` file.

Contributor

Hm, why is it LOCAL_MODELS?

Author

Looks like another difference between this and the openai-chat-app-quickstart sample. That one uses LOCAL_MODELS_ENDPOINT. I like that name more than LOCAL_OPENAI_ENDPOINT, but your call if you want to align the two repos. Happy to edit this PR either way.

Contributor

Ohh, hmm. I like LOCAL_OPENAI in terms of it being an OpenAI-compatible endpoint, but LOCAL_MODELS is also nice in that you're using local models. I'm fine with the LOCAL_MODELS change; please modify the rest of the repo accordingly.
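
For readers landing on this thread: here's a minimal sketch of how the `LOCAL_MODELS_ENDPOINT` setting from the diff above might be consumed. The endpoint URL and model name assume Ollama's default OpenAI-compatible server (`http://localhost:11434/v1`) and a locally pulled model such as `llama3.1`; the client code is illustrative, not copied from this repo.

```python
import os

import openai  # pip install openai

# .env (illustrative): LOCAL_MODELS_ENDPOINT=http://localhost:11434/v1
client = openai.OpenAI(
    base_url=os.environ["LOCAL_MODELS_ENDPOINT"],
    api_key="nokey",  # local servers typically ignore the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3.1",  # assumed: whichever model you pulled, e.g. `ollama pull llama3.1`
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```

llamafile's built-in server exposes a similar OpenAI-compatible `/v1` endpoint (by default on localhost:8080), so only the URL in `.env` would change.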
