refactor: storage manager trait split into multiple subtraits #311
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Merged
Changes from all commits (21 commits)
All 21 commits are by ZocoLini:

- `d31da81` traits created
- `3ee9369` the disk storage manager worker is now a time based check, removed ol…
- `b80fb21` tests updated
- `27f7da2` replaced header_at_height
- `8a34963` removed unused methods
- `861f63d` init_from_checkpoint sync
- `f429763` removed two methods that were involved in the same process
- `eb3487b` fixed clippy warnings
- `31f9938` dropped unused code
- `f60fc00` everything moved where I want it to be
- `739e809` general structure made
- `bd33f0a` persist segments caches now requires the directory where the user wan…
- `c3166cf` using RwLock to allow SegmentCache mutability behind an immutable ref
- `db519b1` clear method fixed
- `e1924ef` default method implementations in storage traits
- `b8850b2` storage manager trait implemented
- `b5fedeb` fixed code to pass the tests
- `99c086a` storage documentation updated
- `0248257` rebase conflicts resolved
- `81e3dec` masternodestate storage was not being persisted following the pattern…
- `79cf7a2` replaced write() locks where a read() can be used
New file (+180 lines):

```rust
//! Header storage operations for DiskStorageManager.

use std::collections::HashMap;
use std::ops::Range;
use std::path::PathBuf;

use async_trait::async_trait;
use dashcore::block::Header as BlockHeader;
use dashcore::BlockHash;
use tokio::sync::RwLock;

use crate::error::StorageResult;
use crate::storage::io::atomic_write;
use crate::storage::segments::SegmentCache;
use crate::storage::PersistentStorage;
use crate::StorageError;

#[async_trait]
pub trait BlockHeaderStorage {
    async fn store_headers(&mut self, headers: &[BlockHeader]) -> StorageResult<()>;

    async fn store_headers_at_height(
        &mut self,
        headers: &[BlockHeader],
        height: u32,
    ) -> StorageResult<()>;

    async fn load_headers(&self, range: Range<u32>) -> StorageResult<Vec<BlockHeader>>;

    async fn get_header(&self, height: u32) -> StorageResult<Option<BlockHeader>> {
        if let Some(tip_height) = self.get_tip_height().await {
            if height > tip_height {
                return Ok(None);
            }
        } else {
            return Ok(None);
        }

        if let Some(start_height) = self.get_start_height().await {
            if height < start_height {
                return Ok(None);
            }
        } else {
            return Ok(None);
        }

        Ok(self.load_headers(height..height + 1).await?.first().copied())
    }

    async fn get_tip_height(&self) -> Option<u32>;

    async fn get_start_height(&self) -> Option<u32>;

    async fn get_stored_headers_len(&self) -> u32;

    async fn get_header_height_by_hash(
        &self,
        hash: &dashcore::BlockHash,
    ) -> StorageResult<Option<u32>>;
}

pub struct PersistentBlockHeaderStorage {
    block_headers: RwLock<SegmentCache<BlockHeader>>,
    header_hash_index: HashMap<BlockHash, u32>,
}

impl PersistentBlockHeaderStorage {
    const FOLDER_NAME: &str = "block_headers";
    const INDEX_FILE_NAME: &str = "index.dat";
}

#[async_trait]
impl PersistentStorage for PersistentBlockHeaderStorage {
    async fn open(storage_path: impl Into<PathBuf> + Send) -> StorageResult<Self> {
        let storage_path = storage_path.into();
        let segments_folder = storage_path.join(Self::FOLDER_NAME);

        let index_path = segments_folder.join(Self::INDEX_FILE_NAME);

        let mut block_headers = SegmentCache::load_or_new(&segments_folder).await?;

        let header_hash_index = match tokio::fs::read(&index_path)
            .await
            .ok()
            .and_then(|content| bincode::deserialize(&content).ok())
        {
            Some(index) => index,
            _ => {
                if segments_folder.exists() {
                    block_headers.build_block_index_from_segments().await?
                } else {
                    HashMap::new()
                }
            }
        };

        Ok(Self {
            block_headers: RwLock::new(block_headers),
            header_hash_index,
        })
    }

    async fn persist(&mut self, storage_path: impl Into<PathBuf> + Send) -> StorageResult<()> {
        let block_headers_folder = storage_path.into().join(Self::FOLDER_NAME);
        let index_path = block_headers_folder.join(Self::INDEX_FILE_NAME);

        tokio::fs::create_dir_all(&block_headers_folder).await?;

        self.block_headers.write().await.persist(&block_headers_folder).await;

        let data = bincode::serialize(&self.header_hash_index)
            .map_err(|e| StorageError::WriteFailed(format!("Failed to serialize index: {}", e)))?;

        atomic_write(&index_path, &data).await
    }
}

#[async_trait]
impl BlockHeaderStorage for PersistentBlockHeaderStorage {
    async fn store_headers(&mut self, headers: &[BlockHeader]) -> StorageResult<()> {
        let height = self.block_headers.read().await.next_height();
        self.store_headers_at_height(headers, height).await
    }

    async fn store_headers_at_height(
        &mut self,
        headers: &[BlockHeader],
        height: u32,
    ) -> StorageResult<()> {
        let mut height = height;

        let hashes = headers.iter().map(|header| header.block_hash()).collect::<Vec<_>>();

        self.block_headers.write().await.store_items_at_height(headers, height).await?;

        for hash in hashes {
            self.header_hash_index.insert(hash, height);
            height += 1;
        }

        Ok(())
    }

    async fn load_headers(&self, range: Range<u32>) -> StorageResult<Vec<BlockHeader>> {
        self.block_headers.write().await.get_items(range).await
    }

    async fn get_tip_height(&self) -> Option<u32> {
        self.block_headers.read().await.tip_height()
    }

    async fn get_start_height(&self) -> Option<u32> {
        self.block_headers.read().await.start_height()
    }

    async fn get_stored_headers_len(&self) -> u32 {
        let block_headers = self.block_headers.read().await;

        let start_height = if let Some(start_height) = block_headers.start_height() {
            start_height
        } else {
            return 0;
        };

        let end_height = if let Some(end_height) = block_headers.tip_height() {
            end_height
        } else {
            return 0;
        };

        end_height - start_height + 1
    }

    async fn get_header_height_by_hash(
        &self,
        hash: &dashcore::BlockHash,
    ) -> StorageResult<Option<u32>> {
        Ok(self.header_hash_index.get(hash).copied())
    }
}
```
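The default `get_header` implementation above only touches storage when the requested height falls inside `[start_height, tip_height]`. That bounds check can be illustrated with a simplified, synchronous stand-in (the `Headers` struct and `u64` payload below are placeholders for the real async trait and `BlockHeader`):

```rust
// Simplified, synchronous sketch of the range check in the default
// `get_header`: out-of-range heights return None without a storage read.
struct Headers {
    start_height: u32,
    headers: Vec<u64>, // stand-in for the stored BlockHeader values
}

impl Headers {
    fn tip_height(&self) -> Option<u32> {
        if self.headers.is_empty() {
            None
        } else {
            Some(self.start_height + self.headers.len() as u32 - 1)
        }
    }

    fn get_header(&self, height: u32) -> Option<u64> {
        let tip = self.tip_height()?; // no headers stored -> None
        if height > tip || height < self.start_height {
            return None; // outside the stored range
        }
        self.headers.get((height - self.start_height) as usize).copied()
    }
}

fn main() {
    let h = Headers {
        start_height: 100,
        headers: vec![10, 11, 12],
    };
    assert_eq!(h.get_header(99), None); // below start
    assert_eq!(h.get_header(101), Some(11)); // in range
    assert_eq!(h.get_header(103), None); // above tip
    println!("ok");
}
```

Checking both bounds up front is what lets the trait provide this method as a default: implementors only supply `get_tip_height`, `get_start_height`, and `load_headers`.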
New file (+101 lines):

```rust
use std::path::PathBuf;

use async_trait::async_trait;

use crate::{
    error::StorageResult,
    storage::{io::atomic_write, PersistentStorage},
    ChainState,
};

#[async_trait]
pub trait ChainStateStorage {
    async fn store_chain_state(&mut self, state: &ChainState) -> StorageResult<()>;

    async fn load_chain_state(&self) -> StorageResult<Option<ChainState>>;
}

pub struct PersistentChainStateStorage {
    storage_path: PathBuf,
}

impl PersistentChainStateStorage {
    const FOLDER_NAME: &str = "chainstate";
    const FILE_NAME: &str = "chainstate.json";
}

#[async_trait]
impl PersistentStorage for PersistentChainStateStorage {
    async fn open(storage_path: impl Into<PathBuf> + Send) -> StorageResult<Self> {
        Ok(PersistentChainStateStorage {
            storage_path: storage_path.into(),
        })
    }

    async fn persist(&mut self, _storage_path: impl Into<PathBuf> + Send) -> StorageResult<()> {
        // The current implementation persists data every time it is stored.
        Ok(())
    }
}

#[async_trait]
impl ChainStateStorage for PersistentChainStateStorage {
    async fn store_chain_state(&mut self, state: &ChainState) -> StorageResult<()> {
        let state_data = serde_json::json!({
            "last_chainlock_height": state.last_chainlock_height,
            "last_chainlock_hash": state.last_chainlock_hash,
            "current_filter_tip": state.current_filter_tip,
            "last_masternode_diff_height": state.last_masternode_diff_height,
            "sync_base_height": state.sync_base_height,
        });

        let chainstate_folder = self.storage_path.join(Self::FOLDER_NAME);
        let path = chainstate_folder.join(Self::FILE_NAME);

        tokio::fs::create_dir_all(chainstate_folder).await?;

        let json = state_data.to_string();
        atomic_write(&path, json.as_bytes()).await?;

        Ok(())
    }

    async fn load_chain_state(&self) -> StorageResult<Option<ChainState>> {
        let path = self.storage_path.join(Self::FOLDER_NAME).join(Self::FILE_NAME);
        if !path.exists() {
            return Ok(None);
        }

        let content = tokio::fs::read_to_string(path).await?;
        let value: serde_json::Value = serde_json::from_str(&content).map_err(|e| {
            crate::error::StorageError::Serialization(format!("Failed to parse chain state: {}", e))
        })?;

        let state = ChainState {
            last_chainlock_height: value
                .get("last_chainlock_height")
                .and_then(|v| v.as_u64())
                .map(|h| h as u32),
            last_chainlock_hash: value
                .get("last_chainlock_hash")
                .and_then(|v| v.as_str())
                .and_then(|s| s.parse().ok()),
            current_filter_tip: value
                .get("current_filter_tip")
                .and_then(|v| v.as_str())
                .and_then(|s| s.parse().ok()),
            masternode_engine: None,
            last_masternode_diff_height: value
                .get("last_masternode_diff_height")
                .and_then(|v| v.as_u64())
                .map(|h| h as u32),
            sync_base_height: value
                .get("sync_base_height")
                .and_then(|v| v.as_u64())
                .map(|h| h as u32)
                .unwrap_or(0),
        };

        Ok(Some(state))
    }
}
```
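Both files write through `atomic_write` from `crate::storage::io`. The diff does not show that helper, but the usual shape of such a function is write-to-temp-then-rename, which ensures readers never observe a half-written file. A stand-in sketch using only the standard library (the real helper is async and may differ in detail):

```rust
// Hypothetical stand-in for the crate's `atomic_write` helper: write the
// payload to a temporary file, flush it, then rename over the target.
// `rename` within one directory is atomic on POSIX filesystems, so a
// concurrent reader sees either the old file or the new one, never a mix.
use std::fs;
use std::io::Write;
use std::path::Path;

fn atomic_write(path: &Path, data: &[u8]) -> std::io::Result<()> {
    let tmp = path.with_extension("tmp");
    let mut f = fs::File::create(&tmp)?;
    f.write_all(data)?;
    f.sync_all()?; // flush file contents to disk before the rename
    fs::rename(&tmp, path)?; // atomically replace the target
    Ok(())
}

fn main() -> std::io::Result<()> {
    let dir = std::env::temp_dir().join("chainstate_demo");
    fs::create_dir_all(&dir)?;
    let path = dir.join("chainstate.json");
    atomic_write(&path, br#"{"sync_base_height":0}"#)?;
    println!("{}", fs::read_to_string(&path)?); // {"sync_base_height":0}
    Ok(())
}
```

This pattern is why `store_chain_state` can safely persist on every call, as the `persist` comment in the diff notes: each write either fully replaces `chainstate.json` or leaves the previous version intact.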