lobehub/lobe-chat

[Bug] Can't use Ollama custom model #1665

ntvuongg posted onGitHub

💻 Operating System

Ubuntu

📦 Environment

Docker

🌐 Browser

Chrome

🐛 Bug Description

Got this when trying to use ollama custom model:

{
  "error": {
    "headers": {
      "content-length": "18",
      "content-type": "text/plain",
      "date": "Thu, 21 Mar 2024 08:38:46 GMT"
    },
    "stack": "Error: 404 404 page not found\n    at eP.generate (/app/.next/server/edge-chunks/405.js:4:1754)\n    at sp.makeStatusError (/app/.next/server/edge-chunks/405.js:4:13675)\n    at sp.makeRequest (/app/.next/server/edge-chunks/405.js:4:14598)\n    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n    at async V.chat (/app/.next/server/edge-chunks/953.js:1:11226)\n    at async eA (/app/.next/server/edge-chunks/953.js:1:20149)\n    at async /app/.next/server/edge-chunks/212.js:6:64195\n    at async O.execute (/app/.next/server/edge-chunks/212.js:6:61088)\n    at async O.handle (/app/.next/server/edge-chunks/212.js:6:65462)\n    at async ey.handler (/app/.next/server/edge-chunks/212.js:7:31644)",
    "status": 404
  },
  "endpoint": "http://ollama:****/",
  "provider": "ollama"
}

🚦 Expected Behavior

The request should succeed and return a response.

📷 Recurrence Steps

  1. Compose up the app
  2. Select Ollama's custom model
  3. Send message
  4. Got error

📝 Additional Information

Here is my docker-compose.yaml:

version: '3.8'

services:
  lobe-chat:
    image: lobehub/lobe-chat
    container_name: lobechat
    restart: always
    ports:
      - '3210:3210'
    user: root
    environment:
      # OPENAI_API_KEY: sk-xxxx
      # OPENAI_PROXY_URL: https://api-proxy.com/v1
      OLLAMA_PROXY_URL: http://host.docker.internal:11434/v1
      ACCESS_CODE: lobe66

  ollama:
    volumes:
      - /home/vuongnt/workspace/mimi/models:/ollama

    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:latest
    environment:
      - OLLAMA_HOST=0.0.0.0
    ports:
      - 11434:11434
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 2
              capabilities: [gpu]

I also tried configuring it in the settings, but it didn't work (screenshot: settings page, 2024-03-21 15:46). Here is my Ollama model list from inside the container (screenshot: 2024-03-21 15:47).
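Since both services are defined in the same compose file, they share the default compose network, so lobe-chat can reach ollama by its service name; `host.docker.internal` does not resolve on Linux by default. A minimal sketch of the relevant change, assuming the service names from the file above (whether the `/v1` suffix is needed depends on the LobeChat version, as discussed later in this thread):

```yaml
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - '3210:3210'
    environment:
      # Point at the ollama service by its compose service name;
      # both services share the default compose network.
      OLLAMA_PROXY_URL: http://ollama:11434/v1
    depends_on:
      - ollama
```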


👀 @ntvuongg

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

posted by lobehubbot 12 months ago

You need to use the host machine's IP. Lobe does not know the Docker container's ID.

posted by MapleEve 12 months ago


You need to use the host machine's IP. Lobe does not know the Docker container's ID.

I tried with ollama's container IP, but the error still remains.

Here is ollama's container IP address:

<img width="616" alt="Screenshot 2024-03-22 at 00 50 20" src="https://github.com/lobehub/lobe-chat/assets/65907920/8ec1317c-db5f-464d-b20b-4b97f38b0b23">

I accessed lobe-chat's container to check whether it can reach the ollama host: <img width="300" alt="Screenshot 2024-03-22 at 00 51 50" src="https://github.com/lobehub/lobe-chat/assets/65907920/a31c7bc4-3264-4986-9327-5f4504bc1d96">

The error still remains: <img width="1058" alt="Screenshot 2024-03-22 at 00 53 34" src="https://github.com/lobehub/lobe-chat/assets/65907920/027d99de-172d-4d64-986b-2a974e6380c1">

posted by ntvuongg 12 months ago

Please see the documentation. If LobeChat's request follows the OpenAI specification, you need to append /v1 to the URI.

posted by MapleEve 12 months ago
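The endpoint distinction behind this advice can be sketched concretely. The paths below are assumptions based on Ollama's documented API (the native API lives under `/api`, while the OpenAI-compatible one lives under `/v1`); the host and port are illustrative:

```shell
# Sketch: the two Ollama endpoint styles (paths assumed from Ollama's
# API docs; host/port are illustrative, not from this thread).
BASE="http://host.docker.internal:11434"
NATIVE_CHAT="$BASE/api/chat"              # Ollama's native chat endpoint
OPENAI_CHAT="$BASE/v1/chat/completions"   # OpenAI-compatible endpoint
echo "$OPENAI_CHAT"
```

A 404 like the one above is consistent with the client requesting a path the server does not serve, which is why the presence or absence of `/v1` in the proxy URL matters.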


Please see the documentation. If LobeChat's request follows the OpenAI specification, you need to append /v1 to the URI.

I added it and it still doesn't work: <img width="1065" alt="Screenshot 2024-03-22 at 13 11 35" src="https://github.com/lobehub/lobe-chat/assets/65907920/d58cbfc5-cd81-4942-ac46-e5d508fce37e">

posted by ntvuongg 12 months ago

Maybe I need to figure out how to configure with docker compose…

posted by arvinxx 12 months ago

Maybe I need to figure out how to configure with docker compose…

This is exactly the problem I am having right now!

link

Please check whether CORS is causing this problem. I can connect to the IP address with version 0.136, but the problem occurs with 0.137 and above.

posted by cheungpatrick 12 months ago

Maybe I need to figure out how to configure with docker compose…

When you have time, please check whether CORS is causing this problem. I have no issue when using localhost!

posted by cheungpatrick 12 months ago

Maybe I need to figure out how to configure with docker compose…

This is exactly the problem I am having right now!

link

Please check whether CORS is causing this problem. I can connect to the IP address with version 0.136, but the problem occurs with 0.137 and above.

You're right! It worked with v0.136

posted by ntvuongg 12 months ago

I confirm it works with v0.1.136 for me too.

posted by darkvertex 12 months ago

I also face the same issue. However, I cannot pull the image of 0.136 via docker anymore. Is this still an open issue on the latest version?

posted by albertmx 12 months ago

I also face the same issue. However, I cannot pull the image of 0.136 via docker anymore. Is this still an open issue on the latest version?

The v0.136.0 tag still exists on Docker Hub: https://hub.docker.com/layers/lobehub/lobe-chat/v0.136.0/images/sha256-76509d3dcde3e64ae2569fa0270d910c0d079cad7a5dd1cdcc00c96480fae69b?context=explore

Did you forget the "v" when pulling the tag maybe?

posted by darkvertex 12 months ago

but nothing helps. I think the Docker container can't reach the IP address!

posted by alfi4000 11 months ago

I am using this command to run it: sudo docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://192.168.50.53:11435/v1 lobehub/lobe-chat
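On Linux, `host.docker.internal` is not defined inside containers by default, so a `docker run` like the one above can fail to reach services on the host. A common workaround (requires Docker 20.10+; the host IP in the command above would then be replaced by the special hostname) is a sketch like this:

```shell
# Sketch: make host.docker.internal resolve inside the container on
# Linux by mapping it to the host gateway (Docker 20.10+).
docker run -d -p 3210:3210 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11435/v1 \
  lobehub/lobe-chat
```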

posted by alfi4000 11 months ago

✅ @ntvuongg

This issue is closed, If you have any questions, you can comment and reply.

posted by lobehubbot 11 months ago

:tada: This issue has been resolved in version 0.149.0 :tada:

The release is available on:

Your semantic-release bot :package::rocket:

posted by lobehubbot 11 months ago

Got 404 from Ollama on macOS: brand-new installation of LobeChat via Docker v0.155.4.

1. Tried to run:

launchctl setenv OLLAMA_ORIGINS "*"
launchctl setenv OLLAMA_HOST "0.0.0.0"
docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 lobehub/lobe-chat
or
docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://127.0.0.1:11434/v1 lobehub/lobe-chat

no luck.

2. Tried to add http://127.0.0.1:11434/v1 on the settings page. No luck.

Confirmed that ollama is listening on 0.0.0.0:11434.

posted by unizhu 10 months ago

@unizhu remove v1 ?

posted by arvinxx 10 months ago

@ntvuongg @albertmx @alfi4000 @unizhu

I followed the steps below and it worked:

  1. Run the following command:
    docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 lobehub/lobe-chat
  2. Open http://localhost:3210/, then click on "Session Settings" in the top right corner.
  3. In the "Language Model" section, set the "Ollama's Interface proxy address" to http://127.0.0.1:11434.
  4. Enable "Use Client-Side Fetching Mode" (this is a crucial step).

    <img width="999" alt="image" src="https://github.com/lobehub/lobe-chat/assets/7893589/0cb92fc8-b1ce-4901-a7bc-668485b7569c"> <img width="424" alt="image" src="https://github.com/lobehub/lobe-chat/assets/7893589/46dfe60d-32bd-441e-b0fb-a4a330f9839a">
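With client-side fetching enabled, the browser itself (not the LobeChat container) calls Ollama, so Ollama must be reachable from the browser's machine and must allow cross-origin requests from the web page. A quick sanity check, assuming Ollama's default port and its documented `OLLAMA_ORIGINS` setting:

```shell
# Sketch: verify Ollama is reachable from the host machine (the
# browser's network), since client-side fetching bypasses the container.
curl -s http://127.0.0.1:11434   # Ollama's root endpoint answers with a status message
# For cross-origin browser requests, Ollama must also allow the origin, e.g.:
#   OLLAMA_ORIGINS="*" ollama serve
```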

posted by 17khba 10 months ago
