
Streaming tests added #139

Merged — 95 commits merged into master from streaming_tests_added on Nov 11, 2024
Conversation

@Adir111 (Contributor) commented Oct 27, 2024

Related to this PR: kube-HPC/hkube#2009
which is related to this issue: kube-HPC/hkube#2002
Added end-to-end tests for the new streaming scaling logic.
Warning: do not merge until that PR is updated.

@@ -7,7 +7,7 @@ const alg = {
     },
     "name": "statefull-time-statistics-tst",
     "entryPoint": "statefullGetSendTime.py",
-    "cpu": 0.03,
+    "cpu": 0.1,
Member:

So little CPU might cause problems producing messages.

Contributor (Author):

It's true, but only when we expect high message rates. In the tests where we expect this, I override the stateful algorithm's CPU via the createAlg method:

const createAlg = async (alg, cpu) => {
    await deleteAlgorithm(alg.name, true, true);
    if (cpu) {
        alg.cpu = cpu;
    }
    await storeAlgorithms(alg);
};

Would you prefer we increase the CPU allocation for every test's stateful configuration? As it stands, it is only modified when needed, with 0.1 being the default value.
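To illustrate the pattern, here is a self-contained sketch: deleteAlgorithm and storeAlgorithms are stubbed out (in the suite they call the hkube API), and the 0.5 override value is purely illustrative.

```javascript
// Stubs standing in for the real hkube test helpers.
const deleteAlgorithm = async (name, force, keepResources) => { /* no-op stub */ };
const stored = [];
const storeAlgorithms = async (alg) => { stored.push({ ...alg }); };

const createAlg = async (alg, cpu) => {
    await deleteAlgorithm(alg.name, true, true);
    if (cpu) {
        alg.cpu = cpu; // override the 0.1 template default only when a test needs it
    }
    await storeAlgorithms(alg);
};

(async () => {
    await createAlg({ name: "statefull-time-statistics-tst", cpu: 0.1 });      // keeps the default
    await createAlg({ name: "statefull-time-statistics-tst", cpu: 0.1 }, 0.5); // hypothetical high-rate test
    console.log(stored.map(a => a.cpu).join(",")); // prints "0.1,0.5"
})();
```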

    "kind": "stream",
    "flowInput": {
        "process_time": 0.02,
        "flows": [
Member:

Both flows have the same rate. The idea was to test different rates from each node.

Contributor (Author):

It's a template; in the test itself the flow configuration is initialised:


it("should satisfy the request rate of 2 statefuls, each with different rate", async () => {
    await createAlg(statefull);
    algList.push(statefull.name);
    await createAlg(stateless);
    algList.push(stateless.name);

    const flow1Config = {
        programs: [
            { rate: 120, time: 50 }
        ]
    };
    const flow2Config = {
        flowName: "hkube_desc2",
        programs: [
            { rate: 60, time: 50 }
        ]
    };
    streamDifferentFlows.flowInput = combineFlows([flow1Config, flow2Config]);
    ...

Updated the template itself for clarity.
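combineFlows is a helper from the test suite whose implementation isn't shown in this thread. A hypothetical sketch of a helper with this shape (the default flow name and the process_time value are assumptions inferred from the template above, not the suite's actual code) might look like:

```javascript
// Hypothetical combineFlows-style helper; NOT the repository's actual code.
// Builds a flowInput whose "flows" array has one entry per flow config,
// falling back to a default flow name when none is given.
const combineFlows = (flowConfigs) => ({
    process_time: 0.02, // assumed default, taken from the template above
    flows: flowConfigs.map((cfg) => ({
        name: cfg.flowName || "hkube_desc", // assumed default flow name
        programs: cfg.programs,
    })),
});

const flow1Config = { programs: [{ rate: 120, time: 50 }] };
const flow2Config = { flowName: "hkube_desc2", programs: [{ rate: 60, time: 50 }] };

const flowInput = combineFlows([flow1Config, flow2Config]);
console.log(flowInput.flows.map(f => `${f.name}=${f.programs[0].rate}`).join(" "));
// prints "hkube_desc=120 hkube_desc2=60"
```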

@Adir111 Adir111 merged commit 48b856a into master Nov 11, 2024
1 check passed
@Adir111 Adir111 deleted the streaming_tests_added branch November 11, 2024 06:56