Ensure handle_demand busy wait infinite loop doesn't occur & deprecate message_child #779

Merged: 11 commits into master from tests-upgrade on Mar 20, 2024

Conversation

FelonEkonom
Member

Prerequisite for #769
Related to #706

@FelonEkonom requested a review from mat-hek as a code owner March 18, 2024 15:07
@FelonEkonom self-assigned this Mar 18, 2024
@FelonEkonom changed the title from "Ensure handle_demand busy wait infinite loop doesn't occur" to "Ensure handle_demand busy wait infinite loop doesn't occur, deprecate message_child" Mar 18, 2024
@FelonEkonom changed the title from "Ensure handle_demand busy wait infinite loop doesn't occur, deprecate message_child" to "Ensure handle_demand busy wait infinite loop doesn't occur & deprecate message_child" Mar 18, 2024
@FelonEkonom marked this pull request as draft March 18, 2024 16:11
@FelonEkonom marked this pull request as ready for review March 19, 2024 11:14

pipeline = Testing.Pipeline.start_link_supervised!(spec: spec)

Process.sleep(1_000)
Member

you can assert_start_of_stream on each sink instead
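A hedged sketch of this suggestion, assuming the test keeps the pipeline variable from the snippet above and has two sinks; the names :sink_a and :sink_b are placeholders, since the real sink names are not visible in this diff:

import Membrane.Testing.Assertions

# Instead of Process.sleep(1_000): block until each sink reports start of stream
# (assert_start_of_stream waits with a default timeout and fails the test otherwise).
for sink <- [:sink_a, :sink_b] do
  assert_start_of_stream(pipeline, sink)
end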


Process.sleep(1_000)

for _i <- 1..(auto_demand_size + 5) do
Member

why this number?

Member Author

To ensure that this line

DemandHandler.handle_redemand(pad_data.ref, state)

will be executed.
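For illustration, a minimal, hedged sketch (not the actual test from this PR) of the idea behind the auto_demand_size + 5 bound: the source emits more buffers than the sink's auto_demand_size, so demand has to be replenished at least once and the redemand path is exercised. The module names, child names and the value 10 are assumptions made for this example.

defmodule RedemandSketchTest do
  use ExUnit.Case, async: true

  import Membrane.ChildrenSpec
  import Membrane.Testing.Assertions

  alias Membrane.Testing

  defmodule AutoSink do
    # Minimal sink with an auto flow control input pad, so that the
    # auto_demand_size pad option used below applies to it.
    use Membrane.Sink

    def_input_pad :input, accepted_format: _any, flow_control: :auto

    @impl true
    def handle_buffer(:input, _buffer, _ctx, state), do: {[], state}
  end

  test "sending more buffers than auto_demand_size forces a redemand" do
    auto_demand_size = 10

    spec =
      child(:source, %Testing.Source{
        output: Enum.map(1..(auto_demand_size + 5), &"payload_#{&1}")
      })
      |> via_in(:input, auto_demand_size: auto_demand_size)
      |> child(:sink, AutoSink)

    pipeline = Testing.Pipeline.start_link_supervised!(spec: spec)

    # The finite source sends end_of_stream only after all buffers went through,
    # which requires demand to be replenished after the initial auto demand.
    assert_end_of_stream(pipeline, :sink)
    Testing.Pipeline.terminate(pipeline)
  end
end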

Member Author
@FelonEkonom Mar 19, 2024

Now I think it would be good to have auto_demand_size < delayed_demands_loop_counter_limit, to ensure that the counter won't reset after reaching the linked code. I will change this.
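As a rough, hedged illustration of that constraint (both numbers below are made up; the real delayed_demands_loop_counter_limit is internal to membrane_core and may differ):

delayed_demands_loop_counter_limit = 20
auto_demand_size = 10

# The test setup relies on this inequality; the match below fails loudly if it is violated.
# Keeping auto_demand_size below the limit means the delayed demands loop counter
# cannot be reset before the assertion of interest runs.
true = auto_demand_size < delayed_demands_loop_counter_limit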

Member

so let’s change the naming or add a comment indicating it’s to exceed the loop counter limit

@FelonEkonom requested a review from mat-hek March 19, 2024 16:05
Comment on lines 57 to 59
# auto_demand_size is smaller than delayed_demands_loop_counter_limit to ensure, that counter
# doesn't reset after calling handle_demand caused by reading positive atomic demand value
# during the snaphsot
Member

Suggested change:

- # auto_demand_size is smaller than delayed_demands_loop_counter_limit to ensure, that counter
- # doesn't reset after calling handle_demand caused by reading positive atomic demand value
- # during the snaphsot
+ # auto_demand_size is smaller than delayed_demands_loop_counter_limit, to ensure that
+ # after calling handle_demand and making a snapshot, the atomic demand value is still positive
+ # and the counter is not reset

@FelonEkonom merged commit 0143b79 into master Mar 20, 2024
5 of 6 checks passed
@FelonEkonom deleted the tests-upgrade branch March 20, 2024 13:27