WeSearch

Show HN: Secure-by-default Ollama Docker image with built-in auth, only 70MB


Docker image to run an Ollama local LLM server. Secure by default: all API requests require a Bearer token, auto-generated on first start. Exposes an OpenAI-compatible API and supports first-start model pre-pull.

Original article: GitHub
Opening excerpt (first ~120 words)

Ollama on Docker. Docker image to run an Ollama local LLM server. Provides an OpenAI-compatible API for running large language models locally. Based on Debian Trixie (slim). Designed to be simple, private, and secure by default.

Features:
- Secure by default: all API requests require a Bearer token (auto-generated on first start)
- Auto-generates an API key on first start, stored in the persistent volume
- First-start model pre-pull via the OLLAMA_MODELS environment variable
- Model management via a helper script (ollama_manage)
- OpenAI-compatible API: point any OpenAI SDK or app at your local server with a one-line change
- Caddy reverse proxy enforces Bearer token auth on all API requests (except the / health check)
- NVIDIA GPU (CUDA) acceleration for faster inference…
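The "point any OpenAI SDK at your local server" workflow boils down to sending a standard chat-completions request with the generated key as a Bearer token. A minimal stdlib sketch, assuming the server listens on Ollama's default port 11434 and using a made-up key and model name (read the real key from the persistent volume):

```python
# Build an OpenAI-style /v1/chat/completions request against the local server.
# Base URL, API key, and model name below are illustrative assumptions.
import json
import urllib.request


def chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Construct (but do not send) a Bearer-authenticated chat request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # required by the Caddy proxy
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = chat_request("http://localhost:11434", "sk-local-example", "llama3", "Hello")
# Send with urllib.request.urlopen(req) once the container is running.
```

An OpenAI SDK client would do the same thing via `base_url` and `api_key` parameters — that is the "one-line change" the README refers to.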

Excerpt limited to ~120 words for fair-use compliance. The full article is at GitHub.
