
[Bug] Private Ollama deployment: how to configure the service when starting NextChat via Docker #6304

Open
oui-a11y opened this issue Feb 27, 2025 · 27 comments
Labels
bug Something isn't working

Comments

@oui-a11y

📦 Deployment method

Docker

📌 Software version

v2.15.8

💻 System environment

Other Linux

📌 System version

7

🌐 Browser

Chrome

📌 Browser version

latest

🐛 Problem description

After deploying DeepSeek privately via Ollama, I want to make NextChat available to my team. I start it via Docker with the following parameters:
docker run -d -p 3000:3000 -e BASE_URL=http://127.0.0.1:11434 -e CUSTOM_MODELS="+deepseek-r1-14b" -e DEFAULT_MODEL="deepseek-r1:14b" yidadaa/chatgpt-next-web
After opening the page:

Image
The custom endpoint is not used, so the feature is unusable.

Question: how should I configure the startup parameters in this scenario?

📷 Reproduction steps

No response

🚦 Expected result

No response

📝 Additional information

No response
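For context (editor's note, not part of the original report): inside a container, 127.0.0.1 refers to the container itself, so BASE_URL=http://127.0.0.1:11434 cannot reach an Ollama instance listening on the host. Two common fixes, sketched under the assumption that Ollama runs directly on a Linux host. Note also that CUSTOM_MODELS and DEFAULT_MODEL should use the same Ollama model tag (deepseek-r1:14b), whereas the original command adds +deepseek-r1-14b.

```shell
# Option 1 (Linux only): share the host's network stack, so 127.0.0.1:11434
# inside the container is the host's Ollama. With --network host, -p is
# ignored and the app serves on the host's port 3000 directly.
docker run -d --network host \
  -e BASE_URL=http://127.0.0.1:11434 \
  -e CUSTOM_MODELS="+deepseek-r1:14b" \
  -e DEFAULT_MODEL="deepseek-r1:14b" \
  yidadaa/chatgpt-next-web

# Option 2 (portable): map host.docker.internal to the host gateway
# (built in on Docker Desktop; the --add-host flag is needed on Linux).
# Ollama must also accept non-loopback connections, e.g. OLLAMA_HOST=0.0.0.0.
docker run -d -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  -e BASE_URL=http://host.docker.internal:11434 \
  -e CUSTOM_MODELS="+deepseek-r1:14b" \
  -e DEFAULT_MODEL="deepseek-r1:14b" \
  yidadaa/chatgpt-next-web
```

These are deployment fragments; which option applies depends on whether Ollama runs on the host or in its own container.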

@oui-a11y oui-a11y added the bug Something isn't working label Feb 27, 2025

@oui-a11y oui-a11y changed the title [Bug] [Bug] Private Ollama deployment: how to configure the service when starting NextChat via Docker Feb 27, 2025
@amlkiller

You should fill in Ollama's IP as seen from Docker, not 127.0.0.1. I'd suggest learning some ops basics before attempting this.


@oui-a11y
Author

@amlkiller It's the same machine, so what's the difference?


@amlkiller

@amlkiller It's the same machine, so what's the difference?

Docker's 127.0.0.1 is the container's own loopback, not the host machine's. Go read the Docker docs and follow a tutorial.


@oui-a11y
Author

Docker's 127.0.0.1 is the container's own loopback, not the host machine's. Go read the Docker docs and follow a tutorial.

Then why can I still use it without configuring BASE_URL at all? I'm just humbly asking for advice; there's no need to lecture. Knowing a little doesn't mean knowing everything.


@oui-a11y
Author

To put it simply: the current NextChat really does have a problem. On the same machine it still works with no configuration bound at all, and while in use the selected model changes on its own. Those are the issues. Whether it's 127.0.0.1 or not doesn't matter as long as it can connect, does it? What's wrong with that?


@amlkiller

amlkiller commented Feb 27, 2025

I apologize for my tone, but I do think this is an ops issue.
BASE_URL is for using the provider's service when you are not defining a custom endpoint. Filling in localhost means the container is asking itself. On a single machine that may happen to work for you, but it also suggests your Docker setup is misconfigured: ollama and nextchat should be two separate containers, and nextchat's localhost address should not be able to reach ollama.

BASE_URL=http://new-api:3000
OPENAI_API_KEY=sk-ZL8ndYKNiBspyQXqaX5pkZ0XcLMhvppElvMRYpTvhSJN2A8k
CODE=88888888
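The two-container setup described here can be sketched as follows (editor's note: the network and container names are placeholders, not from this thread). On a user-defined bridge network, NextChat can reach Ollama by container name instead of an IP.

```shell
# Containers on the same user-defined bridge network resolve each other
# by name, so BASE_URL can point at the Ollama container directly.
docker network create llm-net

docker run -d --name ollama --network llm-net \
  -v ollama:/root/.ollama \
  ollama/ollama

docker run -d --name nextchat --network llm-net -p 3000:3000 \
  -e BASE_URL=http://ollama:11434 \
  -e CUSTOM_MODELS="+deepseek-r1:14b" \
  -e DEFAULT_MODEL="deepseek-r1:14b" \
  yidadaa/chatgpt-next-web
```

The same layout is often written as a docker compose file, where the service name plays the role of the hostname.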


@oui-a11y
Author

After pulling the image and running it normally, I did not set BASE_URL, yet I can still reach the ollama service. I'm genuinely confused about which configuration the project actually uses at runtime.


@oui-a11y
Author

Image For example, I have no BASE_URL configured right now, yet the chat feature works normally. I don't understand why.


@amlkiller

Image For example, I have no BASE_URL configured right now, yet the chat feature works normally. I don't understand why.

That screenshot is the front-end custom-endpoint setting; BASE_URL is an environment variable. Please, really, go read the Docker docs and some ops material.


@Kosette
Contributor

Kosette commented Mar 1, 2025

The project's default BASE_URL is OpenAI; how could that connect to ollama? Spooky.

If you're using Docker Desktop, you can reach host ports via host.docker.internal; on Linux, just use the host network directly. Skim the docs a little and it's fairly easy to deploy.
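To check which of these addresses is actually reachable from inside the NextChat container, a quick diagnostic (editor's sketch: assumes the container is named nextchat and the image's shell has wget, as Alpine-based images usually do):

```shell
# /api/tags is Ollama's model-list endpoint; a response means the URL
# would work as BASE_URL from inside this container.
docker exec nextchat sh -c \
  'wget -qO- http://host.docker.internal:11434/api/tags || echo unreachable'
docker exec nextchat sh -c \
  'wget -qO- http://127.0.0.1:11434/api/tags || echo unreachable'
```

Whichever address answers with a JSON model list is the one to put in BASE_URL.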


@oui-a11y
Author

oui-a11y commented Mar 1, 2025

Well, I do understand all of that. But the problem really does exist after deployment; my screenshots show it.
I know to read the docs, and I did. The Docker containers are all healthy, and this has nothing to do with the Docker network. Simply put, my current problem is: after starting with the command above, the page can reach ollama directly; the NextChat configuration is as in the screenshot in my previous comment.
I'm just as confused, which is why I'm asking for help.


@oui-a11y
Author

oui-a11y commented Mar 1, 2025

Image With this configuration, no custom ollama endpoint is set, so why does chat still work? Isn't that puzzling?


@oui-a11y
Author

oui-a11y commented Mar 1, 2025

Image My feeling is that something is wrong with how this configuration is handled. Without BASE_URL configured, as in the screenshot above, and with no custom address set, I can still reach the local ollama, as in the screenshot below. Isn't that puzzling? Image

