Llama-stack API key in configuration #21
Conversation
@@ -1 +1,4 @@
name: foo bar baz
llama_stack:
  url: http://localhost:8321
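Given the PR title, the new `llama_stack` section would presumably also carry a key for authenticating against the llama-stack server. A hedged sketch of what the extended configuration could look like; the `api_key` field name and the environment-variable substitution are assumptions, not confirmed by the diff:

```yaml
name: foo bar baz
llama_stack:
  url: http://localhost:8321
  # assumed field name; value sourced from the environment rather than
  # committed to the repository
  api_key: ${LLAMA_STACK_API_KEY}
```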
Are you planning on running llama-stack (server) separately to your own uvicorn instance?
The default server launched with llama-stack is here.
I suspect you could use Meta's code as a starting point to integrate llama-stack itself into your code?
Hi @manstis, it still needs to be decided, but the current approach is to run llama-stack as a separate image; the service will call it via a LlamaStackClient object. Do you think it's doable this way?
Hi @tisnik, running llama-stack as a separate image may be difficult.
llama-stack will take some configuration from lightspeed-stack (providers, models, etc.), though how much is unclear. Your distribution would then need to manage two containers: one for lightspeed-stack and one for llama-stack.
You may want to look at this too.
i.e. lightspeed-stack is the server but uses llama-stack as a library internally.
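The two deployment shapes being weighed above could be selected at startup from configuration. The sketch below is illustrative only: the function name, the config keys, and both factory parameters are assumptions. The factories stand in for the remote `LlamaStackClient` and for llama-stack's in-process library client, so the sketch runs without either package installed:

```python
def make_llama_stack_client(config: dict, remote_factory, library_factory):
    """Pick between a remote llama-stack server and in-process library mode.

    remote_factory stands in for llama_stack_client.LlamaStackClient;
    library_factory stands in for llama-stack's library-mode client.
    All names here are hypothetical, not lightspeed-stack's actual API.
    """
    if config.get("use_as_library"):
        # library mode: llama-stack runs inside the lightspeed-stack
        # process, so there is only one container to manage
        client = library_factory(config["template"])
        client.initialize()  # library client is assumed to need explicit init
        return client
    # remote mode: talk to a separately deployed llama-stack container
    return remote_factory(base_url=config["url"])
```

The trade-off mirrors the thread: remote mode keeps the two services independently upgradable, while library mode avoids shipping and coordinating a second container.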
That one looks reasonable, thank you @manstis !
I tested it just now; it is definitely a possible solution, yes.
Description
Llama-stack API key in configuration