# Open WebUI
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners such as Ollama and OpenAI-compatible APIs, and includes a built-in inference engine for RAG, making it a powerful AI deployment solution.
## Prerequisites

- Deploy k0rdent v1.4.0: QuickStart
- Deploy Ingress-nginx to expose the application web UI
## Install template to k0rdent

```shell
helm upgrade --install open-webui oci://ghcr.io/k0rdent/catalog/charts/kgst --set "chart=open-webui:8.10.0" -n kcm-system
```
## Verify service template
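After the chart installation completes, the ServiceTemplate should appear in the `kcm-system` namespace. A quick check (the template name `open-webui-8-10-0` follows from the chart version installed above):

```shell
kubectl get servicetemplate open-webui-8-10-0 -n kcm-system
```

The template is ready to use once its `VALID` column reports `true`.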
## Deploy service template
### Configuration without GPU

Tested on a worker with `instanceType: t3.xlarge` and `rootVolumeSize: 32`:
```yaml
apiVersion: k0rdent.mirantis.com/v1beta1
kind: ClusterDeployment
metadata:
  name: aws-example
spec:
  template: aws-standalone-cp-1-0-14
  credential: aws-credential
  config:
    ...
    worker:
      instanceType: t3.xlarge
      rootVolumeSize: 32
    workersNumber: 1
```
Tested service configuration:
```yaml
apiVersion: k0rdent.mirantis.com/v1beta1
kind: MultiClusterService
metadata:
  name: open-webui
spec:
  clusterSelector:
    matchLabels:
      group: demo
  serviceSpec:
    services:
      - template: open-webui-8-10-0
        name: open-webui
        namespace: open-webui
        values: |
          open-webui:
            ollama:
              ollama:
                models:
                  pull: [smollm:135m]
                  run: [smollm:135m]
            ingress:
              enabled: true
              class: "nginx"
              host: 'openwebui.example.com'
```
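The MultiClusterService above targets clusters labeled `group: demo`. As a sketch of the deployment steps, assuming the ClusterDeployment name `aws-example` from above and that the manifest is saved locally as `open-webui.yaml`:

```shell
# Label the target cluster so the clusterSelector matches it
kubectl label clusterdeployment aws-example group=demo -n kcm-system

# Apply the MultiClusterService manifest
kubectl apply -f open-webui.yaml
```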
### Configuration with GPU

This setup requires a cluster provisioned with GPU support; see NVIDIA GPU Operator.
```yaml
apiVersion: k0rdent.mirantis.com/v1beta1
kind: MultiClusterService
metadata:
  name: open-webui
spec:
  clusterSelector:
    matchLabels:
      group: demo
  serviceSpec:
    services:
      - template: open-webui-8-10-0
        name: open-webui
        namespace: open-webui
        values: |
          open-webui:
            ollama:
              ollama:
                gpu:
                  enabled: true
                  type: 'nvidia'
                  number: 1
                models:
                  pull: [llama3.2:3b]
                  run: [llama3.2:3b]
            ingress:
              enabled: true
              class: "nginx"
              host: 'openwebui.example.com'
```
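With either configuration, you can verify the rollout from the managed cluster. A minimal check, assuming your kubeconfig points at the target cluster and DNS (or a hosts entry) resolves the ingress host:

```shell
# Pods should reach Running once the model pull finishes
kubectl get pods -n open-webui

# The ingress should list the configured host and an address
kubectl get ingress -n open-webui
```

Then open `http://openwebui.example.com` in a browser to reach the web UI.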