There is an environment variable, OPENAI_SERVED_MODEL_NAME_OVERRIDE, that is used to change the served model name. I have tested this option and found that it only works in version 0.6.1: 0.6.2 does not change the model name, and 0.6.3 and 0.6.4 cannot even start the inference server.

I have to use the older, less efficient version in order to use a custom model name and encapsulate the organization and model name into a common, persistent name for ease of integration. This is really annoying; I hope you can find a way to fix this bug, as it effectively blocks users who need a custom model name from using the latest versions of the inference server.
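For clarity, this is the behavior I expect from the override. The sketch below is hypothetical (the function name `resolve_served_model_name` is mine, not from the server's code) and only illustrates the fallback logic: use the override when the environment variable is set, otherwise expose the loaded model's full `org/model` name.

```python
import os

def resolve_served_model_name(loaded_model: str) -> str:
    # Hypothetical sketch of the expected behavior: prefer the
    # override env var when it is set and non-empty, otherwise
    # fall back to the loaded model's full name.
    override = os.environ.get("OPENAI_SERVED_MODEL_NAME_OVERRIDE")
    return override or loaded_model

os.environ["OPENAI_SERVED_MODEL_NAME_OVERRIDE"] = "my-model"
print(resolve_served_model_name("org/some-model-7b"))  # -> my-model
```

This is what 0.6.1 does, and what clients integrating against the persistent name rely on: the name returned by `/v1/models` stays stable even if the underlying org/model changes.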