How to proxy openwebui with nginx

How to proxy openwebui behind nginx with websocket support

March 16, 2025 · bill

The openwebui application uses websockets to stream chat responses directly back to clients. When you’re using nginx as a reverse proxy in front of a webserver running openwebui, the docs provide a pretty vanilla suggested config for nginx.

server {
    listen 80;
    server_name your_domain_or_IP;

    location / {
        proxy_pass http://host.docker.internal:3000;

        # Add WebSocket support (Necessary for version 0.5.0 and up)
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # (Optional) Disable proxy buffering for better streaming response from models
        proxy_buffering off;
    }
}

This contains some important lines, in particular the ones that enable websocket communication. You really do need these:

proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";

The proxy_http_version line forces HTTP/1.1 on the upstream connection; nginx speaks HTTP/1.0 to proxied servers by default, and the websocket handshake requires HTTP/1.1. The Upgrade and Connection headers are what ask the upstream to switch the connection over to a websocket.
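Hard-coding Connection "upgrade" works fine for openwebui, but the nginx websocket proxying docs use a map so that ordinary requests without an Upgrade header send a normal Connection: close instead. If you prefer that variant, it looks roughly like this (the map block goes in the http context, outside the server block):

# In the http context (e.g. nginx.conf), outside the server block
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

# Then, inside the location block, instead of the hard-coded header:
proxy_set_header Connection $connection_upgrade;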

Even with that config in place, you are going to be confronted with a little message indicating a “network error” in the box that should have held the response from the LLM. If you look at the connection to the webserver, you will see that it times out, and it’s the websocket connection specifically that is timing out. You can open the browser inspector and watch this happen in the network tab.
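If you want to confirm it from the server side as well, one option is to log the Upgrade header so websocket requests stand out in the access log. This is purely a debugging aid and not part of the fix; the log_format directive goes in the http context, and the access_log line goes in the server or location block that proxies openwebui:

# Debugging aid only: make websocket upgrade requests visible in the log
log_format ws_debug '$remote_addr [$time_local] "$request" $status '
                    'upgrade="$http_upgrade" connection="$http_connection"';

access_log /var/log/nginx/openwebui_ws.log ws_debug;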

The fix is to go back into your nginx config and raise the proxy timeouts. proxy_read_timeout is how long nginx will wait between reads from the upstream before it closes the connection, and proxy_connect_timeout is how long it will wait to establish the connection to the upstream in the first place:

proxy_read_timeout 300s;
proxy_connect_timeout 75s;

Final config

server {
    listen 80;
    server_name your_domain_or_IP;

    location / {
        proxy_pass http://host.docker.internal:3000;

        # Add WebSocket support (Necessary for version 0.5.0 and up)
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 300s;
        proxy_connect_timeout 75s;

        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # (Optional) Disable proxy buffering for better streaming response from models
        proxy_buffering off;
    }
}

I haven’t tested how low or high you can set these values, but the ones above immediately fixed the issue. It’s likely the read timeout needs to be longer than the time to the first response back from the LLM. Maybe someone can test this out.
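
If you’d rather not hold every proxied request open that long, one option is to scope the longer timeouts to just the websocket endpoint. Here’s a minimal sketch, assuming the socket connection lives under /ws/ (that path is an assumption, so check what your own instance actually uses in the browser’s network tab before copying it):

# Sketch only: /ws/ is an assumed path for openwebui's socket endpoint;
# verify the real path in the browser's network tab. This goes inside the
# same server block as the existing location / block.
location /ws/ {
    proxy_pass http://host.docker.internal:3000;

    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";

    # Long timeouts only for the long-lived socket connection
    proxy_read_timeout 300s;
    proxy_connect_timeout 75s;

    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
}

With a block like that in place, the plain location / block can keep nginx’s default timeouts for ordinary requests.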