lobehub/lobe-chat
The issue has been closed
[Request] Please add an option to create more than one OpenAI model provider in settings. When using LiteLLM, we need multiple OpenAI model options, because LiteLLM is used to run multiple local proxy servers, each serving a different LLM. #1515
Greatz08 posted on GitHub
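For context, a single LiteLLM proxy can also expose several upstream models behind one OpenAI-compatible endpoint via its `model_list` config. A minimal sketch (model names, ports, and backends here are illustrative assumptions, not from the issue):

```yaml
# config.yaml for the LiteLLM proxy (started with: litellm --config config.yaml)
# Each entry becomes a selectable "model" on the proxy's OpenAI-compatible API.
model_list:
  - model_name: local-llama        # name clients send as "model"
    litellm_params:
      model: ollama/llama2         # hypothetical local backend
      api_base: http://localhost:11434
  - model_name: gpt-4-proxy
    litellm_params:
      model: openai/gpt-4
      api_key: os.environ/OPENAI_API_KEY
```

Even with this single-proxy setup, a client UI still needs a way to register more than one OpenAI-compatible endpoint or model name, which is what the request above asks for.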