Add congestion info script #12457
base: master
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

@@            Coverage Diff             @@
##           master   #12457      +/-   ##
==========================================
+ Coverage   71.44%   71.46%   +0.01%
==========================================
  Files         838      838
  Lines      169339   169339
  Branches   169339   169339
==========================================
+ Hits       120985   121015      +30
+ Misses      43006    42974      -32
- Partials     5348     5350       +2
@telezhnaya Would you like me to merge this, or shall we wait for the congestion level support first and do it that way?
I guess my question is whether you would like me to review it now or later :)
Overall looks good, but there are some minor bugs; please see comments.
if not current_block["header"]["height"] % 10:
    print(current_block["header"]["height"])
for chunk in current_block["chunks"]:
    shard_info = ShardCongestionInfo(**chunk["congestion_info"])
nice
if args.shard_ids and shard_info.allowed_shard not in args.shard_ids:
    continue
That's not right, we want to filter by the shard id, not the allowed shard id. I think the shard id should be in the chunk.
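A minimal sketch of the suggested fix, assuming the chunk header in the block response carries its own shard_id field (the field name here is an assumption, not confirmed from the diff):

    # Sketch: filter by the chunk's own shard id instead of
    # congestion_info.allowed_shard, which refers to a different shard.
    for chunk in current_block["chunks"]:
        shard_id = chunk["shard_id"]  # assumed field on the chunk header
        if args.shard_ids and shard_id not in args.shard_ids:
            continue
        shard_info = ShardCongestionInfo(**chunk["congestion_info"])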
"height"]: | ||
if not current_block["header"]["height"] % 10: | ||
print(current_block["header"]["height"]) | ||
for chunk in current_block["chunks"]: |
Do you happen to know how this works for missing chunks? Does it just have the old chunk data? Sanity check - is that what we want?
@dataclass(frozen=True)
class ShardCongestionInfo:
Aren't we going to use the simpler congestion level?
#[allow(non_snake_case)]
pub fn EXPERIMENTAL_congestion_level(
    &self,
    request: RpcCongestionLevelRequest,
) -> RpcRequest<RpcCongestionLevelResponse> {
    call_method(&self.client, &self.server_addr, "EXPERIMENTAL_congestion_level", request)
}
Does this mean that the congestion level isn't currently available via json rpc?
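For context, a raw JSON-RPC call to the endpoint added in the Rust snippet above could look roughly like the sketch below. The params layout is an assumption modeled on other per-shard requests (the actual fields of RpcCongestionLevelRequest are not shown in this diff), so treat it as illustrative only:

    import requests  # assumes the requests package is available

    RPC_URL = "http://localhost:3030"  # hypothetical local node address

    def get_congestion_level(block_height, shard_id):
        # Method name taken from the Rust snippet above; params shape is assumed.
        payload = {
            "jsonrpc": "2.0",
            "id": "dontcare",
            "method": "EXPERIMENTAL_congestion_level",
            "params": {"block_id": block_height, "shard_id": shard_id},
        }
        response = requests.post(RPC_URL, json=payload).json()
        return response.get("result")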
let block =
    client.block(BlockReference::BlockId(BlockId::Height(current_height))).await.unwrap();
What about missing blocks?
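One way the script could cope with heights for which no block was produced is to treat the RPC error as "missing" and fall back to the previous height. A sketch under that assumption (the exact error shape returned by the node is not shown in this PR):

    import requests  # assumes the requests package is available

    def get_block_or_none(rpc_url, height):
        # Query a block by height; if that height was skipped, the node
        # responds with an error instead of a result, and we return None
        # so the caller can step back to the previous height.
        payload = {
            "jsonrpc": "2.0",
            "id": "dontcare",
            "method": "block",
            "params": {"block_id": height},
        }
        response = requests.post(rpc_url, json=payload).json()
        if "error" in response:  # assumed: missing/garbage-collected block
            return None
        return response["result"]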
Add congestion info script. It collects all the data within the given interval, walking from the end to the start (block headers expose prev_height, and there is no next_height available).
The interval can be of any length, and the process can be stopped in the middle; everything collected up to that point is saved.
Attaching an example of the collected data (see the sketch after the file name below); please share whether this is enough for further investigation.
congestion_data.csv
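A rough sketch of the backward walk described above, assuming the block header exposes prev_height and that rows are appended to the CSV as they are collected. The helper names and the exact CSV columns are illustrative, not the script's actual code:

    import csv
    import requests  # assumes the requests package is available

    RPC_URL = "http://localhost:3030"  # hypothetical node address

    def fetch_block(height):
        payload = {
            "jsonrpc": "2.0",
            "id": "dontcare",
            "method": "block",
            "params": {"block_id": height},
        }
        return requests.post(RPC_URL, json=payload).json()["result"]

    def collect(start_height, end_height, out_path="congestion_data.csv"):
        # Walk from the end of the interval towards the start: each header
        # carries prev_height, so going backwards never needs a next_height.
        height = end_height
        with open(out_path, "a", newline="") as f:
            writer = csv.writer(f)
            while height >= start_height:
                block = fetch_block(height)
                for chunk in block["chunks"]:
                    writer.writerow(
                        [height, chunk["shard_id"], chunk.get("congestion_info")])
                f.flush()  # rows survive if the process is stopped mid-way
                height = block["header"]["prev_height"]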