lobehub/lobe-chat
The issue has been closed
[Bug] Can't use Ollama custom model #1665
ntvuongg posted on GitHub
💻 Operating System
Ubuntu
📦 Environment
Docker
🌐 Browser
Chrome
🐛 Bug Description
Got this error when trying to use an Ollama custom model:
{
  "error": {
    "headers": {
      "content-length": "18",
      "content-type": "text/plain",
      "date": "Thu, 21 Mar 2024 08:38:46 GMT"
    },
    "stack": "Error: 404 404 page not found\n at eP.generate (/app/.next/server/edge-chunks/405.js:4:1754)\n at sp.makeStatusError (/app/.next/server/edge-chunks/405.js:4:13675)\n at sp.makeRequest (/app/.next/server/edge-chunks/405.js:4:14598)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async V.chat (/app/.next/server/edge-chunks/953.js:1:11226)\n at async eA (/app/.next/server/edge-chunks/953.js:1:20149)\n at async /app/.next/server/edge-chunks/212.js:6:64195\n at async O.execute (/app/.next/server/edge-chunks/212.js:6:61088)\n at async O.handle (/app/.next/server/edge-chunks/212.js:6:65462)\n at async ey.handler (/app/.next/server/edge-chunks/212.js:7:31644)",
    "status": 404
  },
  "endpoint": "http://ollama:****/",
  "provider": "ollama"
}
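The error body is Ollama's plain-text "404 page not found" page (note the content-length of 18, the exact length of that string), which its server returns for any route it does not serve, so the request reaches an HTTP server but on a base URL or path it does not recognize. As a sanity check (an editorial sketch, not part of the original report), the endpoint can be probed directly; it assumes Ollama's port is published on localhost:11434 as in the compose file below, and the probed paths are illustrative, not the exact ones LobeChat calls:

import urllib.error
import urllib.request

BASE = "http://localhost:11434"  # assumption: Ollama's published port from the compose file below

# "/" should answer "Ollama is running"; /api/tags and /api/version are
# documented native Ollama routes. Any unknown route gets the same
# plain-text "404 page not found" body seen in the error above.
for path in ("/", "/api/tags", "/api/version"):
    try:
        with urllib.request.urlopen(BASE + path, timeout=5) as resp:
            print(path, "->", resp.status, resp.read(80))
    except urllib.error.HTTPError as e:
        print(path, "->", e.code, e.read(80))
    except urllib.error.URLError as e:
        print(path, "->", "unreachable:", e.reason)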
🚦 Expected Behavior
The model call should succeed and return a response.
📷 Recurrence Steps
- Bring the app up with docker compose
- Select Ollama's custom model
- Send a message
- The error above is returned
📝 Additional Information
Here is my docker-compose.yaml:
version: '3.8'
services:
  lobe-chat:
    image: lobehub/lobe-chat
    container_name: lobechat
    restart: always
    ports:
      - '3210:3210'
    user: root
    environment:
      # OPENAI_API_KEY: sk-xxxx
      # OPENAI_PROXY_URL: https://api-proxy.com/v1
      OLLAMA_PROXY_URL: http://host.docker.internal:11434/v1
      ACCESS_CODE: lobe66
  ollama:
    volumes:
      - /home/vuongnt/workspace/mimi/models:/ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:latest
    environment:
      - OLLAMA_HOST=0.0.0.0
    ports:
      - 11434:11434
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 2
              capabilities: [gpu]
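Two details of this file are worth checking against the 404 (an editorial note, not from the original report): both services share the default compose network, so the lobe-chat container can reach Ollama directly at http://ollama:11434 via the service name, whereas host.docker.internal does not resolve inside a container on a Linux host unless an extra_hosts entry maps it to the host gateway. To rule out Ollama itself, the sketch below posts a chat request to its native /api/chat route from the host, bypassing LobeChat entirely; the model name llama2 is a placeholder for whatever model is actually pulled:

import json
import urllib.request

payload = {
    "model": "llama2",  # placeholder; replace with a model you have pulled
    "messages": [{"role": "user", "content": "Say hello in one word."}],
    "stream": False,    # request a single JSON reply instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",  # assumption: port published as in the compose file
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req, timeout=120) as resp:
    reply = json.load(resp)
    # A successful reply means Ollama itself works, pointing the 404 at the
    # configured endpoint URL rather than at the Ollama install.
    print(reply["message"]["content"])

If this call succeeds while LobeChat still fails, the OLLAMA_PROXY_URL value (the host and any trailing path) is the first thing to revisit.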
I also tried configuring the endpoint in the settings, but it does not work:
Here is my Ollama model (listed from inside the container):