LCORE-1285: update llama stack to 0.5.0 #1112
jrobertboos wants to merge 6 commits into lightspeed-core:main
Conversation
…api; adjust constants and tests accordingly
Walkthrough
This PR upgrades the Llama Stack dependencies to version 0.5.0 across project configuration and constants, and updates the corresponding test expectations. Additionally, the text extraction utility is enhanced to handle plain string elements within content sequences.
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~8 minutes
🚥 Pre-merge checks: ✅ 3 passed
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@pyproject.toml`:
- Around line 31-33: Update the mismatched dependency for llama-stack-api:
replace "llama-stack-api==0.5.0" with the latest published version
"llama-stack-api==0.4.3" (or align all three to a consistent, released version)
so installations won't fail; locate the dependency entry for llama-stack-api in
the pyproject.toml dependency list and change the version string accordingly.
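The mismatch the review flags can be caught mechanically. Below is a minimal sketch; the `consistent_pins` helper and the literal pin values are illustrative assumptions, not part of the repository:

```python
def consistent_pins(pins: dict[str, str]) -> bool:
    """Return True when every pinned package uses the same version string."""
    return len(set(pins.values())) <= 1


# Pins as described by the review: llama-stack-api lags behind the others.
pins = {
    "llama-stack": "0.5.0",
    "llama-stack-client": "0.5.0",
    "llama-stack-api": "0.4.3",
}
print(consistent_pins(pins))  # False until the pins are aligned
```

A check like this could run in CI so that the three Llama Stack pins can never drift apart silently.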
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/utils/responses.py`:
- Around line 1082-1084: The code currently appends part.strip() to
text_fragments unconditionally inside the branch that checks isinstance(part,
str); change this to only append when the stripped string is non-empty to avoid
inserting empty fragments from whitespace-only parts—update the branch handling
the variables part and text_fragments (where the if isinstance(part, str): ...
continue logic appears) so that you compute stripped = part.strip() and append
stripped only if stripped is truthy.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- uv.lock is excluded by !**/*.lock
📒 Files selected for processing (4)
- pyproject.toml
- src/constants.py
- src/utils/responses.py
- tests/e2e/features/info.feature
🚧 Files skipped from review as they are similar to previous changes (2)
- tests/e2e/features/info.feature
- pyproject.toml
```python
if isinstance(part, str):
    text_fragments.append(part.strip())
    continue
```
Avoid appending empty fragments from whitespace-only string parts.
Line 1083 appends part.strip() unconditionally. If part is whitespace-only, this adds "" and can create spacing artifacts in the final joined text.
Suggested fix

```diff
-        if isinstance(part, str):
-            text_fragments.append(part.strip())
+        if isinstance(part, str):
+            stripped_part = part.strip()
+            if stripped_part:
+                text_fragments.append(stripped_part)
             continue
```
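To see the artifact the review describes, here is a self-contained sketch; the `extract_text` helper and the single-space join are assumptions made for illustration, not the repository's actual joining logic:

```python
def extract_text(parts: list[str], skip_empty: bool) -> str:
    """Collect stripped string parts and join them with a single space."""
    fragments = []
    for part in parts:
        if isinstance(part, str):
            stripped = part.strip()
            if skip_empty and not stripped:
                continue  # drop whitespace-only parts, as the fix suggests
            fragments.append(stripped)
    return " ".join(fragments)


parts = ["hello", "   ", "world"]
print(extract_text(parts, skip_empty=False))  # 'hello  world' (double space)
print(extract_text(parts, skip_empty=True))   # 'hello world'
```

The unconditional version keeps the empty string produced by `"   ".strip()`, so the join emits two consecutive separators; filtering empty fragments removes the artifact.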
```python
text_fragments: list[str] = []
for part in content:
    if isinstance(part, str):
```
This is not necessary. The function extracts text parts from OpenResponses input/output, which is either a plain string or a sequence of response items, i.e., never a sequence of plain strings.
```diff
 # Minimal and maximal supported Llama Stack version
 MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
-MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.4.3"
+MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.0"
```
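A sketch of how such min/max constants are typically used to validate a reported version. The `supported` helper and the naive tuple comparison are assumptions for illustration (pre-release tags are not handled), not the project's actual check:

```python
MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.0"


def _parse(version: str) -> tuple[int, ...]:
    """Split a dotted version string into comparable integer components."""
    return tuple(int(x) for x in version.split("."))


def supported(version: str) -> bool:
    """True when version falls inside the inclusive [minimal, maximal] range."""
    return (
        _parse(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
        <= _parse(version)
        <= _parse(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    )


print(supported("0.5.0"))   # True
print(supported("0.2.16"))  # False
```

Bumping only the maximal constant, as this PR does, widens the accepted range without changing the lower bound.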
Description
Updated Llama Stack to 0.5.0 in order to enable the network configuration on providers so that TLS and Proxy support can be added.
Type of change
Tools used to create PR
Identify any AI code assistants used in this PR (for transparency and review context)
Related Tickets & Documents
Checklist before requesting a review
Testing
Summary by CodeRabbit
Release Notes
Chores
Bug Fixes