lobehub/lobe-chat

Do you want to work on this issue?

You can request a bounty to promote it!

[Request] Please add an option to create more than one OpenAI model provider in settings. When using LiteLLM, we run multiple local proxy servers, each serving a different LLM behind an OpenAI-compatible API, so more than one OpenAI provider entry is needed. #1515

HakaishinShwet posted on GitHub
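The setup the request describes can be sketched as a list of named OpenAI-compatible endpoints, each pointing at a different LiteLLM proxy. This is a minimal illustration only; the provider names, ports, and model IDs below are hypothetical, not anything from lobe-chat or LiteLLM itself:

```python
from dataclasses import dataclass

@dataclass
class OpenAICompatibleProvider:
    """One OpenAI-compatible endpoint, e.g. a LiteLLM proxy fronting a local LLM."""
    name: str
    base_url: str
    model: str

# Hypothetical example: two LiteLLM proxies on different local ports,
# each exposing a different backend model through the same OpenAI API shape.
providers = [
    OpenAICompatibleProvider("litellm-llama", "http://localhost:4000/v1", "llama-3-8b"),
    OpenAICompatibleProvider("litellm-mistral", "http://localhost:4001/v1", "mistral-7b"),
]

def chat_completions_url(p: OpenAICompatibleProvider) -> str:
    # All providers share the OpenAI route layout; only base_url and model differ,
    # which is why a single hard-coded OpenAI entry in settings is not enough.
    return f"{p.base_url}/chat/completions"

for p in providers:
    print(p.name, chat_completions_url(p))
```

Because every proxy speaks the same API, the only per-provider settings a client needs are a base URL, an API key, and a model name, which is exactly what multiple "OpenAI" entries in the settings UI would capture.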


Fund this Issue: $0.00 funded
