Add Account_Id to insert statement when inserting a record with an Account_Id #7972
Conversation
Update Account_ID column if the table has an account_id.
@coderabbitai review

✅ Actions performed: Review triggered.
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. This behavior can be configured in the settings.
📝 Walkthrough: commitAttributeInstancesBulk() now precomputes which correlation instance tables include an account_id column.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
🚥 Pre-merge checks: ✅ Passed checks (3 passed)
@coderabbitai review

✅ Actions performed: Review triggered.
Actionable comments posted: 1
🧹 Nitpick comments (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java (1)
1674-1679: Unused variables and redundant computations. Variables `tableName` (line 1676) and `isAccount` (line 1677) are declared and assigned but never used. The same values are computed again inline on line 1678.

♻️ Proposed cleanup
```diff
         Map<String, Boolean> correlationHasAccount = new HashMap<>();
         for (CorrelationAttributeInstance.Type artifactType : artifactTypes) {
-            String tableName = correlationTypeToInstanceTableName(artifactType);
-            Boolean isAccount = Boolean.valueOf(correlationAttribHasAnAccount(artifactType));
-            correlationHasAccount.put(correlationTypeToInstanceTableName(artifactType), Boolean.valueOf(correlationAttribHasAnAccount(artifactType)));
+            correlationHasAccount.put(
+                    correlationTypeToInstanceTableName(artifactType),
+                    correlationAttribHasAnAccount(artifactType));
         }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java` around lines 1674 - 1679, The loop in RdbmsCentralRepo builds correlationHasAccount but declares unused locals tableName and isAccount and recomputes values; fix by removing the redundant variables or using them: compute String tableName = correlationTypeToInstanceTableName(artifactType) once and boolean isAccount = correlationAttribHasAnAccount(artifactType) once, then call correlationHasAccount.put(tableName, Boolean.valueOf(isAccount)) (or simply put(tableName, isAccount) if types allow) so no values are recomputed and no unused variables remain.
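A standalone sketch of what the cleaned-up loop boils down to; the two helpers below are simplified stand-ins for RdbmsCentralRepo's `correlationTypeToInstanceTableName()` and `correlationAttribHasAnAccount()`, not the real implementations:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class HasAccountMapSketch {

    // Stand-in for RdbmsCentralRepo.correlationTypeToInstanceTableName()
    static String tableNameFor(String type) {
        return type.toLowerCase() + "_instances";
    }

    // Stand-in for RdbmsCentralRepo.correlationAttribHasAnAccount()
    static boolean hasAccount(String type) {
        return type.equals("EMAIL") || type.equals("PHONE");
    }

    // Builds the table-name -> has-account map, calling each helper once
    // per type and relying on autoboxing instead of Boolean.valueOf().
    static Map<String, Boolean> buildHasAccountMap(List<String> types) {
        Map<String, Boolean> correlationHasAccount = new HashMap<>();
        for (String type : types) {
            correlationHasAccount.put(tableNameFor(type), hasAccount(type));
        }
        return correlationHasAccount;
    }
}
```

Calling each helper exactly once and letting autoboxing do the `boolean` to `Boolean` conversion keeps the loop body to a single statement with no dead locals.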
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java`:
- Around line 1751-1753: The code risks an NPE by calling bulkPs.setLong(9,
eamArtifact.getAccountId()) without null-checking the Long; update the block
that checks correlationHasAccount.get(tableName) to first retrieve Long
accountId = eamArtifact.getAccountId(), then if accountId != null and accountId
>= 0 call bulkPs.setLong(9, accountId) otherwise call bulkPs.setNull(9,
java.sql.Types.BIGINT) (matching the null/negative handling used in
addArtifactInstance); ensure you reference the same PreparedStatement variable
(bulkPs) and parameter index 9 when making the change.
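The null/negative handling the comment describes can be isolated into a small helper. This is a sketch only, and `accountIdToBind` is a hypothetical name; the real code binds directly on `bulkPs`:

```java
import java.sql.Types;

public class AccountBindSketch {

    // Decides what parameter 9 should receive: a Long value to pass to
    // bulkPs.setLong(9, ...), or null meaning bulkPs.setNull(9, ...).
    // The >= 0 guard mirrors the null/negative handling described for
    // addArtifactInstance in the review comment.
    static Long accountIdToBind(Long accountId) {
        if (accountId != null && accountId >= 0) {
            return accountId;
        }
        return null; // caller binds SQL NULL instead
    }

    // The SQL type constant a NULL binding would use; java.sql.Types holds
    // plain int constants, so this compiles without a database connection.
    static final int NULL_TYPE = Types.BIGINT;
}
```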
ℹ️ Review info: Configuration used: defaults | Review profile: CHILL | Plan: Pro | Run ID: be68330e-0712-4137-bde3-d713aa99faef
📒 Files selected for processing (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java
updates based on CodeRabbit review
…opulation-in-central-repository
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java`:
- Around line 1674-1679: The lookup of correlationHasAccount can return null
(because bulkArtifacts via addAttributeInstanceBulk can introduce tableName
values not present when correlationHasAccount is built from
getDefinedCorrelationTypes), so replace direct boolean evaluations of
correlationHasAccount.get(tableName) in RdbmsCentralRepo (locations around the
checks after correlationTypeToInstanceTableName/correlationAttribHasAnAccount)
with a null-safe check using
Boolean.TRUE.equals(correlationHasAccount.get(tableName)); update all
occurrences (including the checks that previously assumed a primitive boolean at
the spots corresponding to the logic around bulkArtifacts and
addAttributeInstanceBulk) to prevent NullPointerException.
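Both null-safe alternatives for reading a `Map<String, Boolean>` behave the same way; a minimal sketch (the map contents here are illustrative, not the real correlation tables):

```java
import java.util.Map;

public class NullSafeLookupSketch {

    // A bare hasAccount.get(table) in a boolean context auto-unboxes the
    // Boolean and throws NullPointerException when the key is missing.
    // Both variants below treat a missing entry as "no account_id column".
    static boolean viaEquals(Map<String, Boolean> hasAccount, String table) {
        return Boolean.TRUE.equals(hasAccount.get(table));
    }

    static boolean viaGetOrDefault(Map<String, Boolean> hasAccount, String table) {
        return hasAccount.getOrDefault(table, false);
    }
}
```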
ℹ️ Review info: Configuration used: defaults | Review profile: CHILL | Plan: Pro | Run ID: a49416dd-9416-401c-997f-bd81aceabb83
📒 Files selected for processing (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java
address coderabbitai comments
♻️ Duplicate comments (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java (1)
1692-1757: ⚠️ Potential issue | 🔴 Critical: Reuse the null-safe `tableHasAccount` decision.

`addAttributeInstanceBulk()` can create new `bulkArtifacts` entries on the fly, so the lookup at line 1751 is still nullable. Line 1693 already treats a missing entry as `false`, but line 1751 auto-unboxes the same lookup and can still abort the whole batch. Compute the boolean once per table and reuse it for both SQL construction and parameter binding.

🛠️ Proposed fix
```diff
-                String sql;
-                if (correlationHasAccount.getOrDefault(tableName, false)) {
+                final boolean tableHasAccount = correlationHasAccount.getOrDefault(tableName, false);
+                String sql;
+                if (tableHasAccount) {
                     sql = "INSERT INTO " + tableName + " (case_id, data_source_id, value, file_path, known_status, comment, file_obj_id, account_id) "
                             + "VALUES ((SELECT id FROM cases WHERE case_uid=? LIMIT 1), "
                             + "(SELECT id FROM data_sources WHERE datasource_obj_id=? AND case_id=? LIMIT 1), ?, ?, ?, ?, ?, ?) "
                             + getConflictClause();
                 } else {
                     sql = "INSERT INTO " + tableName + " (case_id, data_source_id, value, file_path, known_status, comment, file_obj_id) "
                             + "VALUES ((SELECT id FROM cases WHERE case_uid=? LIMIT 1), "
                             + "(SELECT id FROM data_sources WHERE datasource_obj_id=? AND case_id=? LIMIT 1), ?, ?, ?, ?, ?) "
                             + getConflictClause();
                 }
 ...
-                if (correlationHasAccount.get(tableName)) {
+                if (tableHasAccount) {
                     if (eamArtifact.getAccountId() != null && eamArtifact.getAccountId() >= 0) {
                         bulkPs.setLong(9, eamArtifact.getAccountId());
                     } else {
                         bulkPs.setNull(9, Types.INTEGER);
                     }
                 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java` around lines 1692 - 1757, The lookup correlationHasAccount.get(tableName) is auto-unboxed later and can NPE; compute a null-safe boolean once (e.g., boolean tableHasAccount = correlationHasAccount.getOrDefault(tableName, false)) at the start of the addAttributeInstanceBulk() loop and use tableHasAccount both when building the sql string and when binding parameters (replace correlationHasAccount.get(tableName) with tableHasAccount) to avoid nullable lookups.
ℹ️ Review info: Configuration used: defaults | Review profile: CHILL | Plan: Pro | Run ID: cb5c2ad1-835b-4fb3-80a0-9c275a9a044b
📒 Files selected for processing (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java
Fixes suggested by codeRabbit
Coderabbit suggested fix
🧹 Nitpick comments (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java (1)
1751-1757: Consider using `Types.BIGINT` and caching the account ID. The null-handling logic correctly addresses the NPE risk. Two minor improvements:
- The `account_id` column is defined as `BIGINT` in the schema, so `Types.BIGINT` would be more accurate (though `Types.INTEGER` works in practice and matches existing code at line 1062).
- Caching `getAccountId()` in a local variable avoids calling the getter twice.

♻️ Optional refactor
```diff
 if (tableHasAccount) {
-    if (eamArtifact.getAccountId() != null && eamArtifact.getAccountId() >= 0) {
-        bulkPs.setLong(9, eamArtifact.getAccountId());
+    Long accountId = eamArtifact.getAccountId();
+    if (accountId != null && accountId >= 0) {
+        bulkPs.setLong(9, accountId);
     } else {
-        bulkPs.setNull(9, Types.INTEGER);
+        bulkPs.setNull(9, Types.BIGINT);
     }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java` around lines 1751 - 1757, Update the block guarded by tableHasAccount to cache eamArtifact.getAccountId() into a local Long (e.g., accountId) and use that variable for the null-check and set operations; when writing a non-null value call bulkPs.setLong(9, accountId) and when null call bulkPs.setNull(9, Types.BIGINT) so the JDBC type matches the schema and you avoid calling eamArtifact.getAccountId() twice.
ℹ️ Review info: Configuration used: defaults | Review profile: CHILL | Plan: Pro | Run ID: 92f25df1-3f1c-4358-aae6-d8303d97ad0f
📒 Files selected for processing (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java`:
- Around line 1674-1679: The code is using correlationAttribHasAnAccount() as a
proxy for whether an instance table actually has an account_id column, which can
cause account IDs to be dropped during addAttributeInstanceBulk(); update the
logic in the loop that builds correlationHasAccount so that for each tableName
you either (preferred) always include account_id in the INSERT and bind
eamArtifact.getAccountId() or a SQL NULL when the artifact has no account, or
(alternative) call doesColumnExist(conn, tableName, "account_id") to determine
the actual schema and set correlationHasAccount based on that; ensure
addAttributeInstanceBulk() and the prepared statement generation bind account_id
appropriately when present or bind NULL when absent so account data is not
silently lost (reference symbols: correlationAttribHasAnAccount,
addAttributeInstanceBulk, correlationHasAccount map, doesColumnExist,
eamArtifact.getAccountId(), RdbmsCentralRepoFactory instance table definitions).
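The schema-driven alternative hinges on checking whether a column actually exists. The membership check itself can be sketched like this; in the real `doesColumnExist(conn, tableName, "account_id")` the column names would come from JDBC metadata rather than being passed in:

```java
import java.util.Set;

public class ColumnExistsSketch {

    // Simplified stand-in for doesColumnExist(): takes the table's column
    // names directly instead of reading DatabaseMetaData.getColumns().
    // JDBC drivers report column names in varying case, so the comparison
    // ignores case.
    static boolean hasColumn(Set<String> columns, String name) {
        for (String c : columns) {
            if (c.equalsIgnoreCase(name)) {
                return true;
            }
        }
        return false;
    }
}
```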
ℹ️ Review info: Configuration used: defaults | Review profile: CHILL | Plan: Pro | Run ID: 04748871-0779-4bd8-98fe-fda82094b3ee
📒 Files selected for processing (1)
Core/src/org/sleuthkit/autopsy/centralrepository/datamodel/RdbmsCentralRepo.java
Update to use doescolumnexist
@markmckinnon What's the context behind this? Wouldn't the DB get upgraded when it is opened and therefore always have the account ID? Are there paths where the upgrade isn't forced?
@bcarrier the database had an account_id column, but the column has never had any data in it since 4.19.x. The column will have empty values for anything that was inserted since 4.19.x for tables with account_id. This fixes any new data that is inserted going forward.
Update the Account_ID column, if the table has an account_id, when the CR record is inserted. This has not been done since version 4.19.x.