Integrate with OpenWebUI #360
Replies: 6 comments 23 replies
-
You could make a tool/function in OpenWebUI that uses Perplexica's API, and have both OpenWebUI and Perplexica use your local LLM serving (Ollama, TabbyAPI, ...).
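As a rough illustration of that approach, here is a minimal sketch of calling Perplexica from Python, as an OpenWebUI tool/function might. The endpoint path (`/api/search`), default port (3001), and payload fields are assumptions based on Perplexica's documented search API, not a definitive implementation; adjust them to your deployment.

```python
# Sketch: query Perplexica's search API from an OpenWebUI tool/function.
# Endpoint, port, and payload shape are assumptions -- verify against your
# Perplexica version's API docs.
import json
import urllib.request

PERPLEXICA_URL = "http://localhost:3001/api/search"  # assumed default

def build_search_payload(query: str, focus_mode: str = "webSearch") -> dict:
    """Assemble the JSON body Perplexica's /api/search expects (assumed shape)."""
    return {
        "focusMode": focus_mode,  # e.g. webSearch, academicSearch, ...
        "query": query,
        "history": [],            # prior conversation turns, if any
    }

def search(query: str) -> dict:
    """POST the query to Perplexica and return the parsed JSON response."""
    req = urllib.request.Request(
        PERPLEXICA_URL,
        data=json.dumps(build_search_payload(query)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Since both apps point at the same local LLM backend, the OpenWebUI tool only needs Perplexica's HTTP endpoint, not its own model configuration.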
-
I have modified the docker-compose.yaml file to include open-webui and pipelines. It appears to be working; I don't know how it will handle updates.
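For anyone trying the same thing, a minimal sketch of the extra services one might append to Perplexica's docker-compose.yaml could look like the fragment below. The images are the official Open WebUI ones; the host ports and volume names are assumptions you may want to change to avoid conflicts with your existing stack.

```yaml
# Sketch: services appended to Perplexica's docker-compose.yaml
# (host ports and volume names are illustrative assumptions).
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "8080:8080"
    volumes:
      - open-webui:/app/backend/data
  pipelines:
    image: ghcr.io/open-webui/pipelines:main
    ports:
      - "9099:9099"
    volumes:
      - pipelines:/app/pipelines
volumes:
  open-webui:
  pipelines:
```

Keeping everything in one compose file means `docker compose pull && docker compose up -d` updates all services together, which is likely how updates would be handled.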
-
You can use this: https://openwebui.com/f/gwaanl/perplexica_pipe
-
I've installed your new version. I entered the IP, Ollama, and the installed models, and as soon as I did, OpenWebUI froze, so something is wrong. I'll try to figure out how to uninstall it. I'm using a Mac Mini Pro with the latest version of macOS. I'll ask on the OpenWebUI side how to uninstall the pipe; if you have a suggestion, I'm all ears.
-
I have attached the Docker log. If you know how to delete the damaged files and start fresh without losing everything already set up in OpenWebUI, that would be great.
-
I've installed the new version and it works sometimes, but as soon as I try to change the configuration I run into problems and need to restart the OpenWebUI Docker image, sometimes twice, to get it working again. There must be some detail in the code, but the pipe does work. I'm using the latest versions of OpenWebUI and Perplexica.
-
OpenWebUI is a great UI for general chatting with local LLMs. I think that Perplexica is very complementary. What do you think about an integration?