We have packaged Open WebUI for Origo OS

Jan 29, 2026, by Origo

We have built and released an Open WebUI stack for Origo OS. This means you can now install a local AI chat with an LLM of your choice in a minute or two (depending on your download speed) in an Origo OS private or public cloud. Below are a few screenshots of downloading, installing, and using Open WebUI on Origo OS. You can of course do exactly the same using the Origo Public Cloud Toolkit.

  • The Stack is configured to request one GPU. If you want to try it out with CPU only (and are prepared for a rather slow experience), click “Hide preconfigured settings” when installing and set vGPUs to 0.
  • Our Open WebUI Stack uses Ollama as the backend. This means you can pull any of the models from the Ollama library by typing “sudo ollama pull ‘model-name’” in a terminal, as shown in the example after this list.

  • Download the Stack to your Origo OS installation
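
As a minimal sketch of the step above, assuming the Stack's Ollama service is already running in the instance and using llama3 purely as an example model name from the Ollama library, pulling and testing an extra model from a terminal looks like this:

  # Pull an additional model from the Ollama library (model name is an example)
  sudo ollama pull llama3

  # List the models now available locally
  sudo ollama list

  # Optionally chat with the model directly from the terminal
  sudo ollama run llama3 "Hello, what can you do?"

Once the pull completes, the new model should show up in the model selector of the Open WebUI chat window.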
