
LCORE-1285: update llama stack to 0.5.0 #1112

Open
jrobertboos wants to merge 6 commits into lightspeed-core:main from jrobertboos:lcore-1285

Conversation


@jrobertboos jrobertboos commented Feb 6, 2026

Description

Updated Llama Stack to 0.5.0 to enable network configuration on providers, so that TLS and proxy support can be added.

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement
  • Benchmarks improvement

Tools used to create PR

Identify any AI code assistants used in this PR (for transparency and review context)

  • Assisted-by: N/A
  • Generated by: N/A

Related Tickets & Documents

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Testing

  • Please provide detailed steps to perform tests related to this code change.
  • How were the fix/results from this change verified? Please provide relevant screenshots or results.

Summary by CodeRabbit

Release Notes

  • Chores

    • Updated Llama Stack dependencies to version 0.5.0.
  • Bug Fixes

    • Improved text extraction to properly handle plain text content alongside structured content elements.


coderabbitai bot commented Feb 6, 2026

Walkthrough

This PR upgrades Llama Stack dependencies to version 0.5.0 across project configuration and constants, and updates corresponding test expectations. Additionally, the text extraction utility is enhanced to handle plain string elements within content sequences.

Changes

Cohort / File(s) Summary
Dependency Version Updates
pyproject.toml, src/constants.py, tests/e2e/features/info.feature
Bumps llama-stack, llama-stack-client, and llama-stack-api from versions 0.4.3/0.4.4 to 0.5.0; updates MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION constant and test expectations accordingly.
Text Extraction Enhancement
src/utils/responses.py
Adds branch in _extract_text_from_content to handle plain string elements by stripping and appending to text_fragments before type-based processing, ensuring raw strings are included in text extraction.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~8 minutes

Suggested reviewers

  • tisnik
  • radofuchs
🚥 Pre-merge checks | ✅ 3 passed
  • Description Check: ✅ Passed. Check skipped: CodeRabbit's high-level summary is enabled.
  • Title check: ✅ Passed. The title directly and clearly summarizes the primary change (updating Llama Stack to version 0.5.0), which is the main objective across all modified files.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which meets the required threshold of 80.00%.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@pyproject.toml`:
- Around lines 31-33: The pin "llama-stack-api==0.5.0" points to a version that has not been published, so installations will fail. Replace it with the latest released version, "llama-stack-api==0.4.3" (or align all three llama-stack packages to a consistent, released version), in the pyproject.toml dependency list.
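Concretely, the suggested change amounts to something like the following in pyproject.toml. The 0.5.0 pins for llama-stack and llama-stack-client are from this PR; the exact shape of the dependency list is an assumption for illustration:

```toml
[project]
dependencies = [
    "llama-stack==0.5.0",
    "llama-stack-client==0.5.0",
    # llama-stack-api 0.5.0 was not yet published at review time;
    # pin to the latest released version instead.
    "llama-stack-api==0.4.3",
]
```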

@jrobertboos jrobertboos marked this pull request as draft February 9, 2026 20:13
@jrobertboos jrobertboos marked this pull request as ready for review March 3, 2026 18:09
@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/utils/responses.py`:
- Around lines 1082-1084: part.strip() is appended to text_fragments unconditionally inside the isinstance(part, str) branch. Compute stripped = part.strip() and append it only when it is non-empty, so that whitespace-only parts do not insert empty fragments.

ℹ️ Review info

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2e001ab and 60efb35.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (4)
  • pyproject.toml
  • src/constants.py
  • src/utils/responses.py
  • tests/e2e/features/info.feature
🚧 Files skipped from review as they are similar to previous changes (2)
  • tests/e2e/features/info.feature
  • pyproject.toml

Comment on lines +1082 to +1084

```python
        if isinstance(part, str):
            text_fragments.append(part.strip())
            continue
```

⚠️ Potential issue | 🟡 Minor

Avoid appending empty fragments from whitespace-only string parts.

Line 1083 appends part.strip() unconditionally. If part is whitespace-only, this adds "" and can create spacing artifacts in the final joined text.

Suggested fix

```diff
-        if isinstance(part, str):
-            text_fragments.append(part.strip())
+        if isinstance(part, str):
+            stripped_part = part.strip()
+            if stripped_part:
+                text_fragments.append(stripped_part)
             continue
```
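Applied to a standalone helper, the guarded branch behaves as follows. Only the `isinstance(part, str)` branch is taken from the suggested fix; the function name `extract_text` and the handling of structured parts are assumptions for illustration:

```python
def extract_text(parts: list) -> str:
    """Join text from a list of parts, skipping whitespace-only strings."""
    text_fragments: list[str] = []
    for part in parts:
        if isinstance(part, str):
            stripped_part = part.strip()
            # Guard from the suggested fix: a whitespace-only part would
            # otherwise contribute an empty fragment to the joined result.
            if stripped_part:
                text_fragments.append(stripped_part)
            continue
        # Structured parts are assumed to expose their text via `.text`.
        text = getattr(part, "text", "")
        if text.strip():
            text_fragments.append(text.strip())
    return " ".join(text_fragments)
```

Without the guard, the whitespace-only element in the first example below would produce `"hello  world"` (a doubled separator) instead of `"hello world"`.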


```python
    text_fragments: list[str] = []
    for part in content:
        if isinstance(part, str):
```
This is not necessary. The function extracts text parts from OpenResponses input/output, which is either a plain string or a sequence of response items, i.e., never a sequence of plain strings.

```diff
 # Minimal and maximal supported Llama Stack version
 MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
-MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.4.3"
+MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.0"
```
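A pair of constants like this typically bounds a version check. The project's actual validation logic is not shown in this PR; the sketch below assumes simple dotted-integer version strings and compares them as integer tuples:

```python
MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.0"


def _as_tuple(version: str) -> tuple[int, ...]:
    """Parse a dotted version string such as "0.5.0" into a comparable tuple."""
    return tuple(int(p) for p in version.split("."))


def is_supported(version: str) -> bool:
    """Check that a detected Llama Stack version falls in the supported range."""
    return (
        _as_tuple(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
        <= _as_tuple(version)
        <= _as_tuple(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    )
```

Under this sketch, 0.5.1 (mentioned in the comment below) would be rejected until the maximum is bumped again.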
0.5.1 is already out

