
Open WebUI Desktop

Version Downloads Discord License


Your AI, right on your desktop. Open WebUI as a native app. Run models locally or connect to any server. No Docker, no terminal, no setup. Download, launch, chat.

Warning

Early Alpha. Things move fast and stuff might break. Report bugs or come hang out on Discord.

Download

Platform                 Installer
macOS (Apple Silicon)    Download .dmg
macOS (Intel)            Download .dmg
Windows x64              Download .exe
Linux (AppImage)         Download .AppImage
Linux (Debian/Ubuntu)    Download .deb
Linux (Snap)             Download .snap
Linux (Flatpak)          Download .flatpak

Internet required on first launch. After that, everything works offline. All releases β†’

How It Works

πŸ–₯️ Run locally. The app runs Open WebUI on your machine. You can optionally enable the built-in llama.cpp engine to download and run models offline. Nothing leaves your computer.

☁️ Connect remotely. Point the app at any Open WebUI server. Switch between multiple connections from the sidebar.

Use both at the same time.
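Both modes talk to an Open WebUI server over HTTP, whether it's the one bundled in the app or one across the network. As a hypothetical sketch (the endpoint path, port, model name, and API key are assumptions for illustration, not details confirmed by this README), querying a server could look like:

```shell
# Hypothetical sketch: Open WebUI servers expose an OpenAI-compatible
# chat endpoint (assumed path: /api/chat/completions). The URL, model
# name, and API key below are placeholders.
OPENWEBUI_URL="${OPENWEBUI_URL:-http://localhost:8080}"
BODY='{"model":"llama3.1","messages":[{"role":"user","content":"Hello"}]}'

# Actually sending the request needs a running server and an API key
# generated in the web UI, so the call itself is shown commented out:
# curl -s -X POST "$OPENWEBUI_URL/api/chat/completions" \
#   -H "Authorization: Bearer $OPENWEBUI_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
echo "$BODY"
```

Switching between the bundled local instance and a remote one is then just a matter of changing the server URL.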

Highlights

  • ⚑ Spotlight. Hit Shift+Cmd+I (macOS) or Shift+Ctrl+I (Windows/Linux) to summon a floating chat bar over whatever you're doing. Drag to screenshot anything on screen.
  • πŸŽ™οΈ Voice input. System-wide push-to-talk. Press the shortcut from any app to record, and your speech is transcribed and sent to your chat automatically.
  • 🧠 Local inference. Optionally run models entirely on your hardware via the built-in llama.cpp engine. Your data never leaves your machine.
  • 🎯 One-click setup. Launch and connect to a server in seconds. Local models can be enabled from the settings.
  • πŸ”Œ Multiple connections. Juggle servers and switch between them instantly.
  • πŸ”„ Auto-updates. New releases land in the background.
  • πŸ“‘ Offline-ready. No internet needed after initial setup.
  • πŸ’» Cross-platform. macOS, Windows, and Linux.

System Requirements

        Local Models                            Remote Only
Disk    5 GB+                                   ~500 MB
RAM     16 GB+                                  4 GB
OS      macOS 12+, Windows 10+, modern Linux    Same

Note

Local models need serious RAM (7B β‰ˆ 8 GB, 13B β‰ˆ 16 GB). Lighter machine? Connect to a remote server instead.
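The RAM figures above follow a simple rule of thumb. A back-of-envelope sketch (my own approximation, not the app's official sizing): weight memory is roughly parameter count times bytes per weight, and the OS, KV cache, and app overhead add a few GB on top.

```shell
# Rough rule of thumb (an approximation, not official guidance):
# weight memory ~= parameters x bytes per weight.
params_b=7     # model size in billions of parameters
bits=8         # quantization width in bits per weight
weight_gb=$(( params_b * bits / 8 ))
echo "A ${params_b}B model at ${bits}-bit needs about ${weight_gb} GB for weights alone."
```

That puts a 7B model at roughly 7 GB of weights at 8-bit, which lines up with the 8 GB figure once overhead is included; heavier quantization (e.g. 4-bit) cuts the weight footprint roughly in half.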

Privacy

No telemetry. No tracking. No phone-home. Your conversations stay on your machine. Period.

Community

  • πŸ’¬ Discord - Come hang out
  • πŸ› Issues - Report bugs or request features
  • 🌐 Open WebUI - The main project
  • πŸ“– Docs - Full documentation

Contributing

Clone the repository, then install dependencies and start the dev build:

npm install
npm run dev

See CHANGELOG.md for release history. Licensed under AGPL-3.0.