feat: add scope-aware framework bootstrap #5
Conversation
📝 Walkthrough

The PR moves Super OpenCode runtime and hook implementations from

Changes
Sequence Diagram

```mermaid
sequenceDiagram
  participant User as CLI/TUI User
  participant Engine as Framework Engine
  participant Manifest as Manifest Loader/State
  participant Assets as Asset Sync
  participant Config as Config Patcher
  User->>Engine: installFramework(options)
  Engine->>Manifest: loadFrameworkManifest()
  Manifest-->>Engine: FrameworkManifest
  Engine->>Manifest: readInstallState(scopeStatePath)
  Manifest-->>Engine: FrameworkInstallState|undefined
  Engine->>Assets: enumerate & hash assetGroups
  Assets-->>Engine: files + hashes
  Engine->>Assets: syncAssets(sources → targets)
  Assets-->>Engine: ReportItems (installed/updated/conflict)
  Engine->>Config: patchOpencodeConfig(...) / patchTuiConfig(...)
  Config-->>Engine: ConfigPatchResult
  Engine->>Manifest: writeInstallState(updatedState)
  Manifest-->>Engine: persisted
  Engine-->>User: FrameworkReport (items, MCP diagnostics, restartRequired)
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
🚥 Pre-merge checks: ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Review Summary by Qodo

Add scope-aware framework bootstrap engine with TUI and CLI targets

Description

• Implement manifest-driven bootstrap engine for scope-aware framework installation
  - Supports explicit global and project scope install, status, update, uninstall flows
  - Idempotent asset sync with managed file hashes and conflict detection
  - MCP diagnostics and prerequisite validation
• Add TUI and CLI targets for framework bootstrap with explicit scope selection
  - TUI plugin provides Plugin Manager visibility and first-load prompting
  - CLI wrapper delegates to shared bootstrap engine
• Refactor runtime plugin to server target and consolidate hook implementations
  - Move persistence, command, and compaction hooks to src/runtime/hooks.ts
  - Add duplicate-load protection to prevent hook registration conflicts
• Expand test coverage to validate clean installs, updates, conflicts, and uninstalls
  - Test global and project scope isolation and deduplication
  - Test MCP prerequisite diagnostics and state transitions

Diagram

```mermaid
flowchart LR
  A["User installs package"] --> B["TUI or CLI bootstrap"]
  B --> C["Select scope: global or project"]
  C --> D["Load framework manifest"]
  D --> E["Sync assets and config"]
  E --> F["Diagnose MCP prerequisites"]
  F --> G["Write install state"]
  G --> H["Report results"]
  H --> I["Restart OpenCode if needed"]
```
File Changes

1. src/framework/engine.ts
Code Review by Qodo
1.

```ts
const mergedValue = currentValue ? mergeObjects(diagnostic.config, currentValue) : { ...diagnostic.config }
mergedValue.enabled = diagnostic.enabled
```
🚩 MCP config merge gives user values priority over framework updates
At src/framework/config.ts:195, mergeObjects(diagnostic.config, currentValue) uses the framework's MCP config as base and the user's existing config as override. This means non-object scalar fields like command arrays, type, url, and headers values from the user's existing config will always win over framework updates. Only the enabled field is explicitly overridden afterwards at line 196. This design preserves user customizations but also means framework MCP config updates (e.g., changing a Serena command URL or updating a package version in a command array) will never propagate to users who already have existing MCP config entries. This may or may not be the intended tradeoff — worth documenting explicitly.
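The precedence described above can be illustrated with a small sketch. `mergeObjects` here is a hypothetical shallow-recursive merge standing in for the real helper in src/framework/config.ts, and the entry shapes are invented for the example:

```typescript
// Hypothetical sketch of the merge precedence: the framework entry is the
// base, the user's existing entry is the override, so user scalars and arrays
// (like a pinned command array) always win over framework updates.
type JsonObject = Record<string, unknown>

function mergeObjects(base: JsonObject, override: JsonObject): JsonObject {
  const result: JsonObject = { ...base }
  for (const [key, value] of Object.entries(override)) {
    const current = result[key]
    if (
      value && typeof value === "object" && !Array.isArray(value) &&
      current && typeof current === "object" && !Array.isArray(current)
    ) {
      result[key] = mergeObjects(current as JsonObject, value as JsonObject)
    } else {
      result[key] = value // user scalar/array wins over the framework value
    }
  }
  return result
}

// Invented example values: a framework update to the command array never lands
// for users who already have an entry.
const frameworkEntry: JsonObject = { command: ["npx", "serena@2.0.0"], enabled: true }
const userEntry: JsonObject = { command: ["npx", "serena@1.0.0"] }
const merged = mergeObjects(frameworkEntry, userEntry)
merged.enabled = true // only `enabled` is re-applied afterwards, as at line 196
```
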
```ts
let cachedManifestPromise: Promise<FrameworkManifest> | undefined

export async function loadFrameworkManifest(): Promise<FrameworkManifest> {
  if (cachedManifestPromise) {
    return cachedManifestPromise
  }

  cachedManifestPromise = (async () => {
    try {
      const packageRoot = findPackageRoot(import.meta.url)
      const manifestPath = path.join(packageRoot, "framework.manifest.json")
      const rawManifest = await readFile(manifestPath, "utf8")
      return JSON.parse(rawManifest) as FrameworkManifest
    } catch (error) {
      cachedManifestPromise = undefined
      throw error
    }
  })()

  return cachedManifestPromise
}
```
📝 Info: Manifest caching retains promise indefinitely after first successful load
In src/framework/manifest.ts:7-26, cachedManifestPromise is set on first call and only cleared on error. For the CLI this is fine (short-lived process), but the TUI plugin is long-lived. If the user upgrades the npm package while OpenCode is running, the cached manifest would still point to the old version. The TUI would need an OpenCode restart to pick up manifest changes. This is acceptable given the docs say 'Restart OpenCode' after updates, but worth being aware of.
@qodo-code-review /review

/review

Persistent review updated to latest commit df3c97e

/review

Persistent review updated to latest commit 65cac00
```ts
import { autoCheckpointHint, commandPersistenceHint, persistenceContract } from "./memory.js"

const persistenceCommands = new Set(["sc-pm", "sc-save", "sc-load", "sc-reflect"])
const checkpointCommands = new Set(["sc-implement", "sc-build", "sc-test", "sc-document", "sc-task"])
```
🚩 Checkpoint commands set changed significantly from the old code
The old checkpointCommands set in .opencode/plugins/super-opencode/commands.ts was ['sc-save', 'sc-spawn', 'sc-workflow'] with an explicit normalized !== 'sc-save' exclusion (so effectively sc-spawn and sc-workflow received checkpoint hints). The new set in src/runtime/hooks.ts:6 is ["sc-implement", "sc-build", "sc-test", "sc-document", "sc-task"] — a completely different set of commands. This is a deliberate behavioral change, not a mechanical transformation. The old commands sc-spawn and sc-workflow no longer receive checkpoint hints, while several execution-oriented commands now do. This seems intentional (checkpoint before long-running execution, not before planning), but reviewers should confirm the new set matches the desired user-facing behavior.
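The behavioral consequence of the new sets can be sketched as a small dispatch table. The hint names are stand-ins for the real hints in memory.ts, and `hintsFor` is a hypothetical helper, not the actual hook code:

```typescript
// Sketch of how the two command sets drive hint injection. Under the new sets,
// sc-spawn and sc-workflow receive no hints at all, while execution-oriented
// commands like sc-implement receive a checkpoint hint.
const persistenceCommands = new Set(["sc-pm", "sc-save", "sc-load", "sc-reflect"])
const checkpointCommands = new Set(["sc-implement", "sc-build", "sc-test", "sc-document", "sc-task"])

function hintsFor(command: string): string[] {
  const hints: string[] = []
  if (persistenceCommands.has(command)) hints.push("persistenceHint")
  if (checkpointCommands.has(command)) hints.push("checkpointHint")
  return hints
}
```
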
```ts
await syncAssets({
  groups: manifest.assetGroups,
  packageRoot: paths.packageRoot,
  scope: options.scope,
  configDir: paths.configDir,
  projectRoot: paths.projectRoot,
  stateFiles: state.files,
  reportItems: report.items,
  force: false,
  dryRun: true,
})
```
📝 Info: statusFramework mutates in-memory state.files during dry-run but never persists
In statusFramework (src/framework/engine.ts:335-372), syncAssets is called with dryRun: true but state.files is passed as the mutable stateFiles parameter. The sync logic still writes to stateFiles in the content-matching branch (engine.ts lines 159-174) even during dry run — for example, recording files as 'adopted'. Since statusFramework never calls writeInstallState, these in-memory mutations are harmless and discarded. This was considered as a potential bug but dismissed because: (1) each file gets a unique key so mutations don't cross-contaminate, and (2) the state is never persisted.
```ts
const mergedValue = currentValue ? mergeObjects(diagnostic.config, currentValue) : { ...diagnostic.config }
mergedValue.enabled = diagnostic.enabled
```
📝 Info: MCP merge logic treats enabled field as framework-authoritative even for user-modified entries
In patchOpencodeConfig at src/framework/config.ts:308, mergedValue.enabled = diagnostic.enabled unconditionally overwrites the enabled field based on prerequisite diagnostics, even for user-modified MCP entries. This means a user cannot manually force-enable an MCP entry that the framework considers disabled (e.g., setting enabled: true when CONTEXT7_API_KEY is absent). The comment at line 307 explicitly calls this out as intentional: prerequisite diagnostics are always authoritative for runtime enablement. This is a reasonable safety measure (enabling an MCP without its prerequisites would cause runtime errors), but users who want to override this behavior would need to add the MCP entry under a different key name.
```ts
try {
  await rename(tempPath, filePath)
} catch (error) {
  if (!["EEXIST", "EPERM"].includes((error as NodeJS.ErrnoException).code ?? "")) {
    await rm(tempPath, { force: true })
    throw error
  }

  await rm(filePath, { force: true })
  await rename(tempPath, filePath)
}
```
📝 Info: writeInstallState rename fallback leaves temp file on second rename failure
In src/framework/state.ts:46-56, if the initial rename(tempPath, filePath) fails with EEXIST or EPERM (Windows), the fallback deletes the target and retries. If the second rename also fails (e.g., another process recreated the file), the temp file is leaked. This is a very unlikely race condition for an interactive install tool that runs one operation at a time, so it was not flagged as a bug. The existing error handling is reasonable for the expected use case.
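If the temp-file leak ever mattered, one hedged fix is to wrap the retry so the temp file is removed on any failure path. This is a sketch of that alternative, not the shipped src/framework/state.ts code; `renameWithFallback` is a hypothetical name:

```typescript
import { rename, rm } from "node:fs/promises"

// Variant of the fallback above that also cleans up the temp file when the
// second rename fails, instead of leaking it.
async function renameWithFallback(tempPath: string, filePath: string): Promise<void> {
  try {
    await rename(tempPath, filePath)
  } catch (error) {
    const code = (error as NodeJS.ErrnoException).code ?? ""
    if (!["EEXIST", "EPERM"].includes(code)) {
      await rm(tempPath, { force: true })
      throw error
    }
    try {
      await rm(filePath, { force: true })
      await rename(tempPath, filePath)
    } catch (retryError) {
      await rm(tempPath, { force: true }) // clean up even when the retry fails
      throw retryError
    }
  }
}
```
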
```ts
    continue
  }

  const relativePath = prefix ? path.posix.join(prefix, entry.name) : entry.name
```
📝 Info: Cross-platform path handling is carefully maintained
listRelativeFiles at src/framework/engine.ts:269-290 uses path.posix.join for building relative paths (ensuring forward slashes in state keys) while using path.join for filesystem operations (native separators). The destination key normalization at line 384 (split(path.sep).join('/')) ensures state file keys are always forward-slash-separated regardless of platform. During uninstall, path.join(targetBase, relativePath) at line 688 correctly converts forward-slash keys back to native paths for filesystem operations. This cross-platform handling is consistent throughout.
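The two conversions described above can be condensed into a pair of helpers. `toStateKey` and `toNativePath` are hypothetical names for illustration; the real code inlines these expressions:

```typescript
import path from "node:path"

// State keys are always forward-slash-separated, regardless of platform.
function toStateKey(relativePath: string): string {
  return relativePath.split(path.sep).join("/")
}

// path.join normalizes "/" separators back to the native path.sep, so a
// forward-slash state key round-trips into a usable filesystem path.
function toNativePath(targetBase: string, stateKey: string): string {
  return path.join(targetBase, stateKey)
}
```
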
/review

Persistent review updated to latest commit 70ec485

@coderabbitai review

✅ Actions performed: Review triggered.
```ts
if (runtimeState[runtimeLoadMarker]) {
  await client.app.log({
    body: {
      service: "super-opencode",
      level: "info",
      message: "Super OpenCode runtime already active, skipping duplicate hook registration",
    },
  })

  return createCompactionHooks(worktree)
}
```
📝 Info: Duplicate load protection returns compaction hooks only on second registration
When SuperOpenCodePlugin at src/runtime/plugin.ts:20-31 detects a duplicate registration via the global Symbol.for marker, it only returns createCompactionHooks(worktree), omitting system and command hooks. This means the second plugin instance contributes compaction context (which is worktree-specific) but does not contribute system or command hooks (avoiding duplication). The test at tests/plugin-hooks.test.mjs:68-70 confirms this: second['experimental.chat.system.transform'] is undefined. This is intentional design for the dual-registration scenario where both npm and local plugin copies coexist.
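The contract can be sketched in miniature. The marker name and the stand-in hooks below are assumptions for illustration (the review only confirms a global `Symbol.for` marker and the `experimental.chat.system.transform` key), not the real src/runtime/plugin.ts:

```typescript
// Sketch of the dual-registration contract: the first load returns the full
// hook set, every later load returns only the (worktree-specific) compaction
// hooks and sets no marker, avoiding duplicate system/command hooks.
const runtimeLoadMarker = Symbol.for("super-opencode.runtime") // assumed marker key

type Hooks = Record<string, unknown>

function registerHooks(globals: Record<symbol, boolean>, worktree: string): Hooks {
  const compaction: Hooks = { "compaction.context": worktree } // stand-in hook
  if (globals[runtimeLoadMarker]) return compaction
  globals[runtimeLoadMarker] = true
  return { ...compaction, "experimental.chat.system.transform": () => {} }
}
```
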
```ts
if (fileState.origin === "adopted" && !(options.force ?? false)) {
  report.items.push({
    kind: "asset",
    name: relativePath,
    status: "skipped",
    detail: "File is unmanaged (adopted) and will not be removed unless --force is used.",
  })
  continue
}
```
📝 Info: Adopted files are forgotten from state on partial uninstall with conflicts
In uninstallFramework at src/framework/engine.ts:697-704, adopted files are skipped without being added to remainingFiles. If the uninstall hits a conflict with a different managed file (line 707-715), the early return at line 724 persists state with only the conflicting files — adopted files vanish from the tracked state. This is benign because adopted files are never deleted by uninstall (they're skipped), so losing their tracking is harmless. On a subsequent clean uninstall, the adopted files simply won't appear in any report.
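If retaining the tracking were ever desired, the fix would be to carry skipped adopted files into the surviving state. This is a hedged sketch of that alternative with simplified, assumed types (`survivingFiles` is not a real function in the PR):

```typescript
interface FileState { origin: "managed" | "adopted"; hash?: string }

// Keep conflicted files (so the early return can persist them) and also keep
// adopted files that were skipped rather than removed.
function survivingFiles(
  files: Record<string, FileState>,
  removed: Set<string>,
  conflicted: Set<string>,
): Record<string, FileState> {
  return Object.fromEntries(
    Object.entries(files).filter(
      ([key, state]) => conflicted.has(key) || (state.origin === "adopted" && !removed.has(key)),
    ),
  )
}
```
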
```ts
if (policy.requirements?.manual) {
  status = "configured but requires auth/manual setup"
  enabled = false
} else if (missingEnv.length > 0) {
  status = "configured but disabled by missing env"
  enabled = false
} else if (missingBinaries.length > 0) {
  status = "configured but disabled by missing binary"
  enabled = false
}
```
📝 Info: MCP diagnostic priority may report env issues before binary issues
In diagnoseMcpPolicies (src/framework/prerequisites.ts:73-82), the priority order is: manual requirements → missing env → missing binaries. For MCPs like tavily that require both npx (binary) and TAVILY_API_KEY (env), if both are missing, only the env issue is reported in the status. The user would need to fix the env first, then see the binary issue on the next run. Both missingEnv and missingBinaries arrays are always fully populated regardless of status, so programmatic consumers have complete information. This is a minor UX consideration, not a correctness issue.
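A hypothetical variant that surfaces all problems at once, rather than one per run, could look like the sketch below. `describeMcp` and its status strings are invented for illustration and do not match the real `diagnoseMcpPolicies` output:

```typescript
interface Diagnostic { status: string; enabled: boolean }

// Collect every problem instead of stopping at the first branch, so a user
// missing both TAVILY_API_KEY and npx sees both issues in a single run.
function describeMcp(manual: boolean, missingEnv: string[], missingBinaries: string[]): Diagnostic {
  const problems: string[] = []
  if (manual) problems.push("requires auth/manual setup")
  if (missingEnv.length > 0) problems.push(`missing env: ${missingEnv.join(", ")}`)
  if (missingBinaries.length > 0) problems.push(`missing binary: ${missingBinaries.join(", ")}`)
  if (problems.length === 0) return { status: "configured", enabled: true }
  return { status: `configured but disabled (${problems.join("; ")})`, enabled: false }
}
```
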
```ts
async function writeJson(filePath: string, value: JsonObject): Promise<void> {
  await mkdir(path.dirname(filePath), { recursive: true })

  let originalText: string | undefined
  try {
    originalText = await readFile(filePath, "utf8")
  } catch (error) {
    if ((error as NodeJS.ErrnoException).code !== "ENOENT") {
      throw error
    }
  }

  if (originalText === undefined) {
    await writeFile(filePath, `${JSON.stringify(value, null, 2)}\n`, "utf8")
    return
  }

  const parsed = parseJsoncObject(originalText, filePath)
  const updatedText = applyJsoncObjectEdits(originalText, parsed, value)
  await writeFile(filePath, updatedText.endsWith("\n") ? updatedText : `${updatedText}\n`, "utf8")
}
```
📝 Info: writeJson re-reads the file even though the caller already parsed it
Both patchOpencodeConfig and removeFrameworkConfig read and parse the config file via readJsoncObject, then later call writeJson which reads the same file again to preserve JSONC comments during edits. This double-read is intentional: the first read extracts the semantic JSON object for programmatic manipulation, while the second read in writeJson retrieves the raw JSONC text to apply comment-preserving edits via jsonc-parser's modify/applyEdits. There's a theoretical TOCTOU window between the two reads, but since the bootstrap is single-threaded and no concurrent modifications are expected, this is acceptable. A minor optimization would be to thread the raw text through, but the current approach is clearer.
@coderabbitai review

✅ Actions performed: Review triggered.

/review
```ts
const { created, value } = await readJsoncObject(options.filePath)
const config: JsonObject = { ...value }
let changed = created
const addedMcpHashes: Record<string, string> = { ...(options.state?.ownership.addedMcpHashes ?? {}) }
```
📝 Info: addedMcpHashes accumulates stale entries for removed manifest MCPs
In patchOpencodeConfig at src/framework/config.ts:266, addedMcpHashes is initialized by copying ALL previously recorded hashes from state. The loop only processes MCP entries present in the current manifest diagnostics. If a future manifest version removes an MCP entry (e.g., morph), the old hash for that entry persists in addedMcpHashes indefinitely. During uninstall, removeFrameworkConfig would still try to clean it up (which is good), but the state file grows with defunct entries over time. Not a correctness bug, but a minor state hygiene concern.
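The hygiene fix hinted at above is a one-liner: drop recorded hashes whose MCP names no longer appear in the current manifest diagnostics. `pruneStaleHashes` is a hypothetical helper, not part of the PR:

```typescript
// Keep only hashes for MCP names that still exist in the current manifest,
// so entries removed from a future manifest (e.g., "morph") do not accumulate.
function pruneStaleHashes(
  recorded: Record<string, string>,
  currentNames: string[],
): Record<string, string> {
  const keep = new Set(currentNames)
  return Object.fromEntries(Object.entries(recorded).filter(([name]) => keep.has(name)))
}
```

Note that applying this would trade away `removeFrameworkConfig`'s ability to clean up the defunct entries, so the current behavior may be the safer default.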
```ts
for (const diagnostic of diagnostics) {
  report.items.push({
    kind: "mcp",
    name: diagnostic.name,
    status: diagnostic.status,
    detail: diagnostic.detail,
  })
}
```
📝 Info: MCP diagnostics appear in both report.items and report.mcp
In installFramework (src/framework/engine.ts:555 and 607-614), MCP diagnostics are stored in both report.mcp and pushed as items into report.items with kind: "mcp". The CLI renderReport in src/cli.ts:42 correctly handles this by filtering report.items to exclude MCP items and rendering report.mcp separately. The TUI summarizeReport in src/tui.ts:13-14 filters on status strings rather than kind. Both approaches work, but the dual storage pattern is subtle — a consumer of FrameworkReport who doesn't know about this could double-count MCP entries.
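A consumer-side guard against the double-count is to count MCP entries from `report.mcp` only and exclude `kind === "mcp"` items elsewhere, mirroring what the CLI does. The types below are simplified assumptions, not the real `FrameworkReport` shape:

```typescript
interface ReportItem { kind: string; name: string; status: string }
interface Report { items: ReportItem[]; mcp: ReportItem[] }

// Count assets from items (excluding MCP duplicates) and MCPs from report.mcp,
// so the dual-storage pattern never double-counts.
function countEntries(report: Report): { assets: number; mcp: number } {
  const assets = report.items.filter((item) => item.kind !== "mcp").length
  return { assets, mcp: report.mcp.length }
}
```
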
```ts
async function isExecutable(filePath: string): Promise<boolean> {
  try {
    await access(filePath, constants.X_OK)
    return true
  } catch {
    return false
  }
}
```
📝 Info: Cross-platform binary resolution on Windows uses X_OK which is aliased to F_OK
In src/framework/prerequisites.ts:11, isExecutable checks constants.X_OK. On Windows, Node.js maps X_OK to F_OK (file existence check), so any existing file in PATH would be considered executable regardless of actual permissions. The code also lowercases PATHEXT extensions at line 35, which is fine since Windows is case-insensitive. This Windows behavior is a known Node.js limitation, not a bug in this code, but it means binary resolution on Windows is effectively "does the file exist in PATH" rather than "is the file executable."
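The practical consequence is that on Windows the resolver can only filter candidates by PATHEXT extension, not by execute permission. The sketch below illustrates that extension-expansion step; `candidateNames` is a hypothetical helper and the default PATHEXT list is an assumption matching the review text:

```typescript
// On win32, expand a binary name by the (lowercased) PATHEXT extensions; on
// POSIX platforms the name is used as-is and X_OK actually checks permissions.
function candidateNames(binary: string, platform: string, pathext?: string): string[] {
  if (platform !== "win32") return [binary]
  const exts = (pathext ?? ".EXE;.CMD;.BAT;.COM").toLowerCase().split(";")
  return exts.map((ext) => binary + ext)
}
```
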
Was this helpful? React with 👍 or 👎 to provide feedback.
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@PROJECT_INDEX.md`:
- Line 48: Update the user-facing occurrences of "github" to "GitHub" in
PROJECT_INDEX.md; specifically replace the path/label string
`.github/workflows/` and the other listed entries (lines referencing "github" at
the mentioned locations) so the platform name is capitalized consistently, e.g.,
change `.github/workflows/` to `.github/workflows/` in display but ensure any
prose or label that reads "github" becomes "GitHub" (search for the lowercase
"github" tokens and update them in the same file).
In `@src/framework/config.ts`:
- Around line 448-466: The uninstall code currently calls
ensureArray(config.plugin) and ensureArray(config.instructions) which silently
coerces malformed user-owned values (objects/strings) into [] and causes
unexpected deletions; change these branches (the blocks using ensureArray with
removePluginSpecFromEntries and removeManagedInstructionEntries, keyed by
options.state.ownership.addedOpencodePlugin and
options.state.ownership.addedInstructions) to first validate the existing shapes
by using the same validation used in readArrayConfigValue() (or equivalent
type-checking) and if the value is not a proper array, surface a conflict/error
instead of proceeding, leaving the original config untouched and not toggling
changed; only call removePluginSpecFromEntries/removeManagedInstructionEntries
when the value is confirmed to be an array.
- Around line 337-340: The code currently treats pre-existing mcp.<name> entries
as "added" when bootstrap modifies them (using currentValue,
wasPreviouslyManaged, entryChanged), which can cause removeFrameworkConfig() to
delete user-owned data; change the logic so addedMcpKeys/addedMcpHashes are only
populated for keys the framework actually created from scratch (i.e., when
currentValue === undefined) instead of whenever an existing key was changed;
update the condition around addedMcpKeys.push(diagnostic.name) and
addedMcpHashes[diagnostic.name] = hashJsonValue(mergedValue) to require
currentValue === undefined, or alternatively extend FrameworkInstallState to
persist the original value (store originalValue keyed by diagnostic.name) and
write/read that field so removeFrameworkConfig() can restore originals—use the
simpler preferrable fix of restricting addedMcpKeys to newly created keys unless
you also update FrameworkInstallState and removeFrameworkConfig() to support
full rollback.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: cb9c5070-1520-4868-8698-4b7a7196ef83
📒 Files selected for processing (15)
- .gitignore
- .npmignore
- ARCHITECTURE.md
- COMMANDS.md
- PROJECT_INDEX.json
- PROJECT_INDEX.md
- README.md
- src/cli.ts
- src/framework/config.ts
- src/framework/engine.ts
- src/framework/state.ts
- src/framework/types.ts
- src/runtime/plugin.ts
- tests/framework.test.mjs
- tests/plugin-hooks.test.mjs
✅ Files skipped from review due to trivial changes (6)
- .gitignore
- .npmignore
- COMMANDS.md
- PROJECT_INDEX.json
- tests/framework.test.mjs
- src/framework/types.ts
🚧 Files skipped from review as they are similar to previous changes (4)
- tests/plugin-hooks.test.mjs
- src/cli.ts
- src/framework/state.ts
- src/framework/engine.ts
| Path | Kind | Notes |
|---|---|---|
| `.github/workflows/` | automation | CI and npm publish workflows |
Capitalize GitHub consistently in user-facing docs.
These entries use github in prose/path labels. The platform name should be GitHub.
Also applies to: 112-113, 119-120, 145-145
🧰 Tools
🪛 LanguageTool
[uncategorized] ~48-~48: The official name of this software platform is spelled with a capital “H”.
Context: ...| Path | Kind | Notes | |---|---|---| | .github/workflows/ | automation | CI and npm p...
(GITHUB)
🧹 Nitpick comments (6)
tests/framework.test.mjs (2)
940-963: Potential test environment issue with empty `PATH`.

Setting `PATH: ''` to simulate missing binaries may cause unexpected failures if the test framework or Node.js internals rely on PATH-resolved utilities. Consider using a non-existent path instead.

💡 Suggested improvement

```diff
 env: {
-  PATH: '',
+  PATH: '/nonexistent-path-for-test',
   PATHEXT: process.env.PATHEXT ?? '.EXE;.CMD;.BAT;.COM',
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/framework.test.mjs` around lines 940-963, the test 'marks binary-backed MCPs disabled when binaries are unavailable' constructs an env with `PATH: ''`, which can unpredictably break Node or test tooling; update the test to set PATH to a guaranteed-nonexistent directory (e.g., '/no-such-path' or a platform-appropriate bogus path) when calling `statusFramework({ env: { PATH, ... } })` so binaries cannot be found but the runtime's PATH handling remains stable, leaving PATHEXT and other keys unchanged.
57-59: Consider adding error handling for build failures in `beforeAll`.

If `bun run build` fails, subsequent tests will run against stale/missing artifacts with confusing errors. The current setup silently proceeds. Consider logging the failure or re-throwing with context.

💡 Suggested improvement

```diff
 beforeAll(async () => {
-  await execFileAsync('bun', ['run', 'build'], { cwd: projectRoot, env: process.env, timeout: 60000 })
+  try {
+    await execFileAsync('bun', ['run', 'build'], { cwd: projectRoot, env: process.env, timeout: 60000 })
+  } catch (error) {
+    console.error('Build failed:', error.stderr || error.message)
+    throw error
+  }
 }, 120000)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/framework.test.mjs` around lines 57-59, the beforeAll hook calling `execFileAsync('bun', ['run', 'build'], { cwd: projectRoot, env: process.env, timeout: 60000 })` should handle failures explicitly: wrap the await call in a try/catch inside beforeAll, log the error (including stdout/stderr if available) and re-throw or throw a new Error with context so the test run fails fast instead of proceeding with stale/missing artifacts.

src/framework/config.ts (2)
119-140: Recursive JSONC editing may accumulate formatting drift.

The `applyJsoncObjectEdits` function applies edits key-by-key recursively. Each `modify` + `applyEdits` call regenerates the text, which could introduce subtle formatting changes over multiple nested edits. While functionally correct, deeply nested configs may see whitespace drift over repeated operations. This is acceptable for the current use case but worth noting if config formatting stability becomes a concern.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/framework/config.ts` around lines 119-140, `applyJsoncObjectEdits` applies `modify` + `applyEdits` per key recursively, which can cause cumulative formatting drift; change it to gather all TextEdit results (calls to `modify`) for the full traversal into an array and then call `applyEdits` once on the original sourceText, or alternatively perform a single `modify` call for the whole object at the top level, so that `modify`/`applyEdits` are not invoked repeatedly; refer to `applyJsoncObjectEdits`, `modify`, `applyEdits`, and `pathSegments` when locating where to switch from per-key apply to batching edits.
502-514: Duplicate file-removal check can be consolidated.

Lines 502-509 and 511-514 both check `createdTuiConfig && !hasMeaningfulConfigContent(config)` and delete the file. The first branch handles `!changed` while the second handles `changed`. Consider consolidating:

```diff
- if (!changed) {
-   if (options.state.ownership.createdTuiConfig && !hasMeaningfulConfigContent(config)) {
-     await rm(options.filePath, { force: true })
-     return { changed: true, removedFile: true }
-   }
-
-   return { changed: false, removedFile: false }
- }
-
- if (options.state.ownership.createdTuiConfig && !hasMeaningfulConfigContent(config)) {
+ if (options.state.ownership.createdTuiConfig && !hasMeaningfulConfigContent(config)) {
    await rm(options.filePath, { force: true })
    return { changed: true, removedFile: true }
  }
+
+ if (!changed) {
+   return { changed: false, removedFile: false }
+ }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/framework/config.ts` around lines 502-514, the duplicate file-removal check should be run once after determining the final `changed` value instead of in both branches; remove the second block and the first duplicate, then after computing `changed` call `if (options.state.ownership.createdTuiConfig && !hasMeaningfulConfigContent(config)) { await rm(options.filePath, { force: true }); return { changed: true, removedFile: true }; }` so that `rm(options.filePath)` and the `{ changed: true, removedFile: true }` return are centralized (references: variable `changed`, `options.state.ownership.createdTuiConfig`, `hasMeaningfulConfigContent`, `rm`, and `options.filePath`).
47-52: Consider handling `undefined` entry values defensively.

Line 49 extracts `value` from entries but doesn't guard against `undefined` array elements (e.g., sparse arrays or explicit `undefined` entries). While `value === spec` would be `false` for `undefined`, `value?.startsWith(...)` correctly handles it via optional chaining. The logic is functionally correct, but an explicit early `continue` for non-string/non-array entries could improve clarity.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/framework/jsonc.ts` around lines 47-52, the `hasPluginSpec` function currently computes `value` from entries without explicitly skipping undefined or non-string/non-array items; update `hasPluginSpec` so it first checks each entry and early-continues when the entry is undefined, or is not a string and not an array whose first element is a string, then only compute `value` and compare using `value === spec` or `value.startsWith(\`${spec}@\`)`; target the `hasPluginSpec` function and the local variable `value` to add this defensive guard for sparse/undefined array elements.

src/framework/file-write.ts (1)
11-26: Retry logic introduces a small race window.

The fallback path (lines 19-25) removes the destination file then retries the rename. Between `rm(filePath)` and `rename(tempPath, filePath)`, another process could create or lock the file, causing the retry to fail. This is unlikely in typical usage but could occur under concurrent framework operations.

Additionally, `EEXIST` is unexpected here since POSIX `rename()` atomically replaces the target. This branch likely only triggers on Windows with `EPERM`. Consider documenting this Windows-specific behavior or simplifying the condition.

💡 Suggested documentation improvement

```diff
+ // On Windows, rename can fail with EPERM when the target file is locked.
+ // EEXIST is included defensively but should not occur on POSIX systems.
  try {
    await rename(tempPath, filePath)
  } catch (error) {
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/framework/file-write.ts` around lines 11-26, the fallback path that does `await rm(filePath)` then a single retry of `await rename(tempPath, filePath)` has a race window where another process can create or lock `filePath` between `rm` and `rename`; update the logic in the catch block that handles `(error as NodeJS.ErrnoException).code === "EPERM" || "EEXIST"` to perform a safe retry loop: after `rm(filePath, { force: true })`, attempt `rename(tempPath, filePath)` multiple times with a short exponential backoff and an existence/lock check before each retry (e.g., stat or access on filePath), and limit retries before cleaning up tempPath and rethrowing; also add a comment documenting that EEXIST is rare on POSIX and this branch mainly targets Windows/EPERM behavior so callers understand the platform nuance.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@src/framework/config.ts`:
- Around line 119-140: applyJsoncObjectEdits applies modify + applyEdits per key
recursively which can cause cumulative formatting drift; change it to gather all
TextEdit results (calls to modify) for the full traversal (use
applyJsoncObjectEdits to collect edits into an array) and then call applyEdits
once on the original sourceText, or alternatively perform a single modify call
for the whole object at the top-level, so that modify/applyEdits are not invoked
repeatedly; refer to applyJsoncObjectEdits, modify, applyEdits and pathSegments
when locating where to switch from per-key apply to batching edits.
- Around line 502-514: The duplicate file-removal check should be run once after
determining the final `changed` value instead of in both branches; remove the
second block and the first duplicate, then after computing `changed` call `if
(options.state.ownership.createdTuiConfig &&
!hasMeaningfulConfigContent(config)) { await rm(options.filePath, { force: true
}); return { changed: true, removedFile: true }; }` so that
`rm(options.filePath)` and the `{ changed: true, removedFile: true }` return are
centralized (references: variable `changed`,
`options.state.ownership.createdTuiConfig`, `hasMeaningfulConfigContent`, `rm`,
and `options.filePath`).
In `@src/framework/file-write.ts`:
- Around line 11-26: The fallback path that does await rm(filePath) then a
single retry of await rename(tempPath, filePath) has a race window where another
process can create or lock filePath between rm and rename; update the logic in
the catch block that handles (error as NodeJS.ErrnoException).code === "EPERM"
|| "EEXIST" to perform a safe retry loop: after rm(filePath, {force:true})
attempt rename(tempPath, filePath) multiple times with a short exponential
backoff and an existence/lock check before each retry (e.g., stat or access on
filePath) to avoid immediately colliding with another creator, and limit retries
before cleaning up tempPath and rethrowing; also add a comment documenting that
EEXIST is rare on POSIX and this branch mainly targets Windows/EPERM behavior so
callers understand the platform nuance.
In `@src/framework/jsonc.ts`:
- Around line 47-52: The hasPluginSpec function currently computes value from
entries without explicitly skipping undefined or non-string/non-array items;
update hasPluginSpec so it first checks each entry and early-continues (returns
false for that iteration) when entry is undefined or not a string and not an
array whose first element is a string, then only compute value and compare using
value === spec or value.startsWith(`${spec}@`); target the hasPluginSpec
function and the local variable value to add this defensive guard for
sparse/undefined array elements.
In `@tests/framework.test.mjs`:
- Around line 940-963: The test 'marks binary-backed MCPs disabled when binaries
are unavailable' constructs an env with PATH: '' which can unpredictably break
Node or test tooling; update the test to set PATH to a guaranteed-nonexistent
directory (e.g., '/no-such-path' or platform-appropriate bogus path) when
calling statusFramework({ env: { PATH, ... } }) so binaries cannot be found but
the runtime's PATH handling remains stable, leaving PATHEXT and other keys
unchanged.
- Around line 57-59: The beforeAll hook calling execFileAsync('bun', ['run',
'build'], { cwd: projectRoot, env: process.env, timeout: 60000 }) should handle
failures explicitly: wrap the await call in a try/catch inside beforeAll, log
the error (including stdout/stderr if available) and re-throw or throw a new
Error with context so the test run fails fast instead of proceeding with
stale/missing artifacts; update the beforeAll block to catch errors from
execFileAsync and surface them with a descriptive message referencing the build
step.
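The first config.ts item above turns on the difference between applying text edits one key at a time and batching them. A dependency-free sketch of the batching idea follows; jsonc-parser's real `Edit` objects share this `{ offset, length, content }` shape, but this helper is an illustration of the pattern, not that library's API.

```typescript
interface TextEdit {
  offset: number   // start position in the ORIGINAL text
  length: number   // number of characters replaced
  content: string  // replacement text
}

// Apply a batch of non-overlapping edits in one pass. Sorting by descending
// offset keeps earlier offsets valid while later spans are rewritten, which
// is why collecting all edits first and applying them once avoids the
// cumulative formatting drift of repeated modify + apply rounds.
function applyEditsOnce(source: string, edits: TextEdit[]): string {
  const sorted = [...edits].sort((a, b) => b.offset - a.offset)
  let result = source
  for (const edit of sorted) {
    result = result.slice(0, edit.offset) + edit.content + result.slice(edit.offset + edit.length)
  }
  return result
}
```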
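The jsonc.ts item can likewise be sketched. The entry shapes here are assumptions inferred from the prompt text (plugin specs as bare strings or `[spec, options]` tuples); the real function lives in `src/framework/jsonc.ts`.

```typescript
// Hypothetical reconstruction of the guarded hasPluginSpec the prompt asks
// for. Entries may be plugin spec strings ("pkg" or "pkg@1.2.3") or tuples
// like ["pkg", { ...options }]; sparse/undefined slots are skipped instead
// of crashing when value is read from a hole.
type PluginEntry = string | [string, ...unknown[]] | undefined

function hasPluginSpec(entries: readonly PluginEntry[], spec: string): boolean {
  for (const entry of entries) {
    // Defensive guard: ignore holes and shapes we do not understand.
    if (entry === undefined) continue
    const value =
      typeof entry === "string"
        ? entry
        : Array.isArray(entry) && typeof entry[0] === "string"
          ? entry[0]
          : undefined
    if (value === undefined) continue
    if (value === spec || value.startsWith(`${spec}@`)) return true
  }
  return false
}
```

The `${spec}@` comparison matches versioned specs without also matching packages that merely share a prefix.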
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 8840580d-37cc-493c-af60-76a0a0f0f72c
📒 Files selected for processing (13)
- .gitignore
- .npmignore
- scripts/validate-structure.mjs
- src/framework/config.ts
- src/framework/engine.ts
- src/framework/file-write.ts
- src/framework/jsonc.ts
- src/framework/manifest.ts
- src/framework/prerequisites.ts
- src/framework/state.ts
- src/framework/types.ts
- src/runtime/hooks.ts
- tests/framework.test.mjs
✅ Files skipped from review due to trivial changes (4)
- scripts/validate-structure.mjs
- .npmignore
- src/runtime/hooks.ts
- src/framework/types.ts
🚧 Files skipped from review as they are similar to previous changes (5)
- src/framework/manifest.ts
- .gitignore
- src/framework/state.ts
- src/framework/prerequisites.ts
- src/framework/engine.ts
```ts
const wasPreviouslyManaged = options.state?.ownership.addedMcpKeys.includes(diagnostic.name) ?? false
const previousManagedHash = options.state?.ownership.addedMcpHashes[diagnostic.name]
const divergedManagedEntry = currentValue !== undefined && previousManagedHash !== undefined && hashJsonValue(currentValue) !== previousManagedHash
const shouldRefreshManagedEntry = wasPreviouslyManaged && previousManagedHash !== undefined && !divergedManagedEntry

if (currentValue === undefined) {
  addedMcpKeys.push(diagnostic.name)
}

// Managed MCP entries that still match their recorded hash should follow framework defaults on update.
// Diverged or pre-existing user entries keep their explicit values except for prerequisite-driven enablement.
const mergedValue = shouldRefreshManagedEntry
  ? mergeObjects({}, diagnostic.config)
  : currentValue
    ? mergeObjects(diagnostic.config, currentValue)
    : { ...diagnostic.config }

// Prerequisite diagnostics remain authoritative for runtime enablement.
mergedValue.enabled = diagnostic.enabled
const entryChanged = currentValue === undefined || !jsonValuesEqual(currentValue, mergedValue)
if (entryChanged) {
  changed = true
}

mergedMcp[diagnostic.name] = mergedValue

if (currentValue === undefined || (wasPreviouslyManaged && (previousManagedHash === undefined || !divergedManagedEntry))) {
  addedMcpHashes[diagnostic.name] = hashJsonValue(mergedValue)
}
```
📝 Info: MCP ownership hash tracking preserves diverged entry hashes correctly
In patchOpencodeConfig at src/framework/config.ts:260-288, when a previously managed MCP entry has diverged (user modified it), the old hash is preserved in addedMcpHashes because the update condition on line 286 evaluates to false. This means subsequent updates continue to detect the divergence, and the user's modifications are never silently overwritten. During uninstall (removeFrameworkConfig at line 429), diverged entries produce conflicts and remain in the config. The hash-based ownership model is internally consistent.
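As an illustration of the hash-based ownership check described above: `hashJsonValue` and `isDiverged` below are hypothetical stand-ins, and the canonical-JSON hashing shown is an assumption, not necessarily how `src/framework/config.ts` computes its hashes.

```typescript
import { createHash } from "node:crypto"

// Hypothetical stand-in for hashJsonValue: hash a canonical JSON form so
// that key order does not affect the digest.
function hashJsonValue(value: unknown): string {
  const canonical = JSON.stringify(value, (_key, v) =>
    v && typeof v === "object" && !Array.isArray(v)
      ? Object.fromEntries(Object.entries(v as Record<string, unknown>).sort(([a], [b]) => a.localeCompare(b)))
      : v,
  )
  return createHash("sha256").update(canonical).digest("hex")
}

// A managed entry is refreshed only while its current value still matches the
// hash recorded at install time; any user edit flips it to "diverged", after
// which updates leave it alone and uninstall reports a conflict.
function isDiverged(currentValue: unknown, recordedHash: string | undefined): boolean {
  return currentValue !== undefined && recordedHash !== undefined && hashJsonValue(currentValue) !== recordedHash
}
```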
```ts
}

process.stdout.write(renderReport(report))
return report.items.some((item) => item.status === "conflict/manual action required") ? 2 : 0
```
📝 Info: CLI exit code 2 for conflicts causes execFileAsync rejection in tests
The runCli function at src/cli.ts:102 returns exit code 2 when conflicts are detected. The test at tests/framework.test.mjs:486-493 relies on execFileAsync rejecting on non-zero exit codes and accessing failure.stdout from the rejection error. This works because Node's child_process.execFile populates stdout and stderr on the error object when the child exits non-zero. This is a deliberate test pattern, not a fragile dependency.
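This pattern is easy to reproduce in isolation. In the sketch below, a `node -e` child stands in for the real CLI, so nothing here is the framework's actual code; it only demonstrates that `promisify(execFile)` attaches `stdout` and `code` to the rejection error.

```typescript
import { execFile } from "node:child_process"
import { promisify } from "node:util"

const execFileAsync = promisify(execFile)

// The child prints a report line and exits 2, the way runCli does when
// conflicts are detected. promisify(execFile) rejects on non-zero exit and
// Node populates the captured stdout/stderr on the rejection error, which is
// exactly what the test relies on.
async function captureConflictReport(): Promise<{ code: number; stdout: string }> {
  const failure = await execFileAsync(process.execPath, [
    "-e",
    'console.log("conflict: manual action required"); process.exit(2)',
  ]).then(
    () => {
      throw new Error("expected the child process to exit non-zero")
    },
    (error: unknown) => error as { code?: number; stdout?: string },
  )
  return { code: failure.code ?? -1, stdout: failure.stdout ?? "" }
}
```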
```diff
@@ -0,0 +1,303 @@
{
  "name": "super-opencode-framework",
  "version": "1.0.1",
```
📝 Info: PROJECT_INDEX.json contains hardcoded version 1.0.1
The PROJECT_INDEX.json at line 3 hardcodes "version": "1.0.1" while PROJECT_INDEX.md correctly defers to package.json and CHANGELOG.md for version. The JSON file's version will become stale after the next release bump. This is a documentation concern rather than a functional bug — the JSON file is a navigation aid and not consumed programmatically by the framework.
```ts
import { existsSync, readFileSync } from "node:fs"
import path from "node:path"
import { fileURLToPath } from "node:url"

function looksLikePackageRoot(candidate: string): boolean {
  const packageJsonPath = path.join(candidate, "package.json")
  if (!existsSync(packageJsonPath)) {
    return false
  }

  try {
    const packageJson = JSON.parse(readFileSync(packageJsonPath, "utf8")) as { name?: string }
    return packageJson.name === "super-opencode-framework"
  } catch {
    return false
  }
}

export function findPackageRoot(fromUrl: string): string {
  let current = path.dirname(fileURLToPath(fromUrl))

  while (true) {
    if (looksLikePackageRoot(current)) {
      return current
    }

    const parent = path.dirname(current)
    if (parent === current) {
      throw new Error("Unable to resolve the super-opencode-framework package root")
    }

    current = parent
  }
}
```
📝 Info: Package root discovery uses synchronous fs operations
findPackageRoot at src/framework/package-root.ts:19-33 uses existsSync and readFileSync to walk up the directory tree looking for the package root. This is synchronous I/O which blocks the event loop. It's called from resolveScopePaths (which is called from every engine operation) and from loadFrameworkManifest. Since the manifest is cached after the first load and the path walk is typically only 2-3 directories deep, the practical impact is negligible. But it does mean the first call to any framework function has a synchronous I/O burst.
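If that first-call burst ever becomes a concern, the walk is straightforward to memoize. The sketch below is illustrative, not the framework's code: `findNamedPackageRoot` and its module-level cache are assumptions, generalized over the package name so the example is self-contained.

```typescript
import { existsSync, readFileSync } from "node:fs"
import path from "node:path"

const packageRootCache = new Map<string, string>()

// Same upward walk as findPackageRoot, but each starting directory touches
// the filesystem at most once per process; later calls are pure Map lookups.
function findNamedPackageRoot(startDir: string, packageName: string): string {
  const cached = packageRootCache.get(startDir)
  if (cached !== undefined) return cached

  let current = startDir
  while (true) {
    const packageJsonPath = path.join(current, "package.json")
    if (existsSync(packageJsonPath)) {
      try {
        const pkg = JSON.parse(readFileSync(packageJsonPath, "utf8")) as { name?: string }
        if (pkg.name === packageName) {
          packageRootCache.set(startDir, current)
          return current
        }
      } catch {
        // unreadable or invalid package.json: keep walking up
      }
    }
    const parent = path.dirname(current)
    if (parent === current) throw new Error(`Unable to resolve the ${packageName} package root`)
    current = parent
  }
}
```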
Summary
Validation
Review Request
Summary by CodeRabbit
- New Features
- Runtime
- Documentation
- Tests