Unanswered question

Is there a way to compare the results of different iterations?

Hello,

Assume I am running my tests using NeoLoad and applying user load is not a big concern for me at the moment.
I would like to run my test with just a single virtual user.
What I would like to do is design the tests to run in multiple iterations with the same single user.
But in each iteration of the test, the data I use for testing will vary.
Say I am doing an upload test and checking the performance of the system during upload.
The script remains the same in each iteration,
but the data content I upload to the system varies in each iteration.
In the first iteration I would start with a small amount of data,
and in each subsequent iteration the size of the uploaded data would increase.
This is my requirement.
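Just to make the requirement concrete, the upload payloads could be prepared ahead of the run something like the sketch below. The file names and sizes are only examples of increasing data per iteration, not anything NeoLoad-specific.

# Sketch only: pre-generating upload payloads with increasing sizes,
# one file per iteration. File names and sizes are assumptions.
import os

SIZES_MB = [1, 5, 10, 50, 100]  # assumed sizes for iterations 1..5

os.makedirs("upload_data", exist_ok=True)
for iteration, size_mb in enumerate(SIZES_MB, start=1):
    path = os.path.join("upload_data", f"iteration_{iteration:02d}.bin")
    with open(path, "wb") as f:
        f.write(os.urandom(size_mb * 1024 * 1024))
    print(f"iteration {iteration}: wrote {size_mb} MB to {path}")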

Now, with a single user and multiple iterations, I can do this.
But when the test results and report are generated, I would like a way to see how the system behaved and how the performance metrics varied in each iteration.
Is there a way to know this? I guess the current reporting does not have a specific category to display this data.
Please let me know.

Thanks & Warm Regards
Musaffir

Answers

Sulav B.
As far as I understood what you meant, you can make use of the Compare Results/Report feature provided by NeoLoad. After you execute multiple tests, follow these steps to generate a comparison report:

1. Open the Results Manager via Edit > Results Manager.
2. In the table, select the two results to compare (single click for the first row, then CTRL+click for the second row).
3. Click the Generate report button.
4. Edit the labels of the two tests and click Next. The default labels are "A" and "B".
5. Configure the format, the report content and the report file directory, then click OK.
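If it helps, once each result has been exported, the key numbers can also be compared outside NeoLoad. This is only a rough sketch; the file names and CSV column names are my assumptions, not NeoLoad's actual export format.

# Sketch only: comparing two exported result files outside NeoLoad.
# Assumes each test's statistics were exported to CSV with columns
# "transaction" and "avg_response_time_ms" (assumed names).
import csv

def load_stats(path):
    with open(path, newline="") as f:
        return {row["transaction"]: float(row["avg_response_time_ms"])
                for row in csv.DictReader(f)}

test_a = load_stats("result_A.csv")   # assumed export of the first result
test_b = load_stats("result_B.csv")   # assumed export of the second result

for name in sorted(set(test_a) & set(test_b)):
    delta = test_b[name] - test_a[name]
    print(f"{name}: A={test_a[name]:.1f} ms, B={test_b[name]:.1f} ms, delta={delta:+.1f} ms")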

Hi Sulav,
That is not what I meant in my question.
I am not going to have multiple test executions of the scenario.
I am going to have only a single test execution of the scenario.

But I am designing my scenario to run for multiple iterations (more than one), as the runtime configuration allows.

And in the single final report that we can generate for this execution, I am looking for a way to see how things differed between the iterations of my scenario.
The data I pass in each iteration will differ, so I am keen to know how the performance of my application varies across those iterations.
Hope that makes sense.
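To illustrate what I am after, below is roughly the per-iteration breakdown I would like the report to show. It is only a sketch of how I might compute it myself if the raw results could be exported; the CSV file name and column names are just my assumptions, not NeoLoad's actual export format.

# Sketch only: grouping raw per-request results by iteration.
# Assumes a hypothetical export "raw_results.csv" with columns
# "iteration" and "response_time_ms" (assumed names).
import csv
from collections import defaultdict
from statistics import mean

by_iteration = defaultdict(list)
with open("raw_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        by_iteration[int(row["iteration"])].append(float(row["response_time_ms"]))

for iteration in sorted(by_iteration):
    times = by_iteration[iteration]
    print(f"iteration {iteration}: requests={len(times)}, "
          f"avg={mean(times):.1f} ms, max={max(times):.1f} ms")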

Warm Regards
Musaffir