lobehub/lobe-chat

[Request] Please add an option to configure more than one OpenAI provider entry in settings. With LiteLLM we run multiple local proxy servers, each serving a different LLM, so each proxy needs its own OpenAI-compatible provider entry. #1515

Greatz08 posted on GitHub
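To illustrate the request, here is a minimal sketch of why one OpenAI entry is not enough: each LiteLLM proxy exposes its own OpenAI-compatible endpoint on its own port, so a client needs a separate base URL per proxy. The ports, model names, and placeholder API key below are assumptions for illustration only, not part of the original issue.

```ts
import OpenAI from "openai";

// Hypothetical setup: two LiteLLM proxies running locally, each fronting a
// different model. Each proxy is a separate OpenAI-compatible server, so each
// needs its own base URL (and typically a dummy key).
const mistralProxy = new OpenAI({
  baseURL: "http://localhost:4000/v1", // assumed port for proxy #1
  apiKey: "sk-litellm-placeholder",
});

const llamaProxy = new OpenAI({
  baseURL: "http://localhost:4001/v1", // assumed port for proxy #2
  apiKey: "sk-litellm-placeholder",
});

async function main() {
  // Same OpenAI-style chat API, different backends — this is why a single
  // "OpenAI" provider entry in the settings UI cannot cover both proxies.
  const a = await mistralProxy.chat.completions.create({
    model: "mistral",
    messages: [{ role: "user", content: "Hello from proxy #1" }],
  });
  const b = await llamaProxy.chat.completions.create({
    model: "llama2",
    messages: [{ role: "user", content: "Hello from proxy #2" }],
  });
  console.log(a.choices[0].message.content);
  console.log(b.choices[0].message.content);
}

main();
```

In other words, the requested settings change amounts to letting the user register several OpenAI-compatible providers, each with its own base URL and model list, rather than a single hard-coded OpenAI entry.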

