Let's take a look at how LoadNinja's databanks simplify the process by extracting data from TXT or CSV files and dynamically inserting it into load tests.
Load tests should follow the same principles as unit tests when it comes to dynamic test data.
What Are Databanks?
Load testing has become an integral part of the development process. According to Gartner, web application downtime costs companies an average of $5,600 per minute, which works out to more than $300,000 per hour. Load tests help ensure that new code doesn't introduce performance bugs that could lead to significant slowdowns or unexpected application crashes.
Many load tests use hard-coded parameter values that are specified when recording or editing a script. For example, you may load test a login process using just one set of credentials. If two test cases are required, test engineers might simply re-record the script with a different set of credentials and run an entirely separate load test.
Data-driven load testing, or DDT, is the process of using dynamic parameters (known as "databanks" in LoadNinja) in place of hard-coded values. Databanks eliminate the need to create a separate test for each piece of input data, reduce the time it takes to write tests, and produce a more robust test suite that covers all relevant test cases.
Another advantage of databanks is that they can be prepared by stakeholders and business analysts outside of the load testing process. For example, business analysts might review production logs, compile a list of anonymized email addresses (both valid and invalid), and provide them as a TXT or CSV file for test engineers to incorporate into the load test.
How Do They Work?
Most data-driven load tests read dynamic parameters from static files or other data sources. In LoadNinja, databanks are specially formatted CSV or TXT files that contain data delimited with tabs or commas. The platform maps databank columns to event parameters and selects rows randomly, sequentially, or uniquely, depending on your preferences.
LoadNinja lets you attach up to four databanks to each individual test script, enabling you to use dynamic parameters across several areas. For instance, you may have a sign-up form and want to use different databanks for authentication credentials (e.g. username and password) and contact details (e.g. name, address, phone and zip code).
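Under the hood, a databank is just a delimited text file whose columns map onto script parameters. Here's a minimal Python sketch of that parsing step, using a made-up tab-delimited contact-details file (the file contents and column names are illustrative, not part of LoadNinja's API):

```python
import csv
import io

# An illustrative tab-delimited TXT databank for a contact-details form.
CONTACTS_TXT = (
    "name\taddress\tphone\tzip\n"
    "Ada Lovelace\t12 Example St\t555-0100\t90210\n"
    "Alan Turing\t34 Sample Ave\t555-0101\t10001\n"
)

def load_databank(text, delimiter="\t"):
    """Parse databank text into a list of {column: value} dicts,
    one per data row, ready to substitute for hard-coded values."""
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))

rows = load_databank(CONTACTS_TXT)
print(rows[0]["name"])  # Ada Lovelace
```

Each dictionary corresponds to one row of the databank, which is essentially what LoadNinja substitutes into the recorded script on each iteration.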
Before using databanks, it's important to examine why you may need them and how they should be built. Not every input field needs to be dynamic to accurately model performance. It's important to design dynamic inputs that mimic actual user behavior and explore different use cases. For instance, you may want a percentage of the inputs to trigger specific errors.
Some important questions to ask yourself include:
- Does the input data differ considerably in size? (e.g. contact list uploads of varying lengths)
- Does the input data result in different processing times? (e.g. converting large avatar images to thumbnails)
- Does incorrectly formatted input data impact performance? (e.g. do errors take longer to process than successes?)
In many cases, it makes sense to have these conversations with team members who have insight into customer behavior. Stakeholders or business analysts may have a better understanding of the types of data users are likely to input and how much that data varies, while developers can provide insight into the performance impact.
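Once those questions are answered, the databank itself can be generated programmatically. For example, if analysts decide that roughly 20% of login attempts should fail, a script can build a databank with that error ratio baked in. This is a hypothetical sketch; the usernames and passwords are made up:

```python
import random

def make_login_rows(total=100, error_rate=0.2, seed=42):
    """Build (username, password) rows where roughly `error_rate`
    of the passwords are deliberately wrong, to exercise error paths."""
    rng = random.Random(seed)  # seeded so the databank is reproducible
    rows = []
    for i in range(total):
        user = f"user{i}@example.com"
        if rng.random() < error_rate:
            rows.append((user, "wrong-password"))  # should fail login
        else:
            rows.append((user, f"Valid-P4ss-{i}"))
    return rows

rows = make_login_rows()
bad = sum(1 for _, pw in rows if pw == "wrong-password")
print(f"{bad} of {len(rows)} rows use an invalid password")
# Write the rows out with csv.writer to get the CSV file LoadNinja expects.
```

Generating the file rather than hand-writing it makes it easy to adjust the error ratio or row count as the test plan evolves.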
Example Use Cases
There are many different use cases for data-driven load tests. In addition to authentication, you may want to use dynamic parameters for form submissions, computational tasks, API-driven requests, and other areas where the input value influences response time.
Let's take a look at an example of using data-driven load tests for an authentication load test.
Start by creating a CSV file containing different username and password combinations to use for load tests:
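For example, the file might look like this (these credentials are purely illustrative):

```csv
username,password
alice@example.com,S3cret-01
bob@example.com,S3cret-02
carol@example.com,wrong-password
```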
The next step is recording a script that logs in with a username and password. You can do this by going to Projects > (Select Your Project) > Web Tests.
You can add a databank to the test using the following steps:
- Click on Add > Databank in the toolbar.
- Click Browse and select one or more databank files.
- Click the Next: Define File Format button.
- Specify how to treat the first row and the delimiter.
- Click the Next: Review Import button.
- Review the data preview and click the Back button if there are any issues with how it looks.
- Click the Next: Map Data to Script button to proceed.
- Match the appropriate inputs to the Map File and Map Column, and confirm using the Preview.
- Click the Next: Save Mapping button to finish.
You can adjust how the databank rows are selected by clicking on the Settings icon and changing the Run Databank Rows dropdown to match your preferences.
The three options include:
- Random – LoadNinja picks a random data row for each iteration and each virtual user, which means rows may be duplicated in some instances.
- Sequential – LoadNinja picks data rows in order for each iteration within each virtual user. However, multiple virtual users may be using the same data at the same time.
- Unique – LoadNinja assigns unique data rows to virtual users so that no two of them ever use the same data at the same time.
When using the Unique option, it's important to note that the number of databank rows should be equal to or greater than the number of virtual users that will run the script.
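The semantics of the three modes can be sketched in a few lines of Python. This is a conceptual illustration of the selection behavior, not LoadNinja's actual implementation:

```python
import itertools
import random

rows = ["row1", "row2", "row3", "row4"]

def pick_random(rows, iterations):
    """Random: any row may repeat across iterations and virtual users."""
    return [random.choice(rows) for _ in range(iterations)]

def pick_sequential(rows, iterations):
    """Sequential: a virtual user cycles through rows in order, but
    concurrent virtual users can hold the same row at the same time."""
    cycle = itertools.cycle(rows)
    return [next(cycle) for _ in range(iterations)]

def assign_unique(rows, num_users):
    """Unique: each virtual user gets a row no other user holds,
    which requires at least as many rows as virtual users."""
    assert len(rows) >= num_users, "need one row per virtual user"
    return {f"vu{i}": rows[i] for i in range(num_users)}

print(pick_sequential(rows, 6))  # ['row1', 'row2', 'row3', 'row4', 'row1', 'row2']
print(assign_unique(rows, 3))    # each virtual user holds a distinct row
```

The assertion in `assign_unique` mirrors the constraint above: with the Unique option, the databank must contain at least as many rows as there are virtual users.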
The Bottom Line
Data-driven load tests simplify load testing by eliminating redundancy, cutting down on test writing time and improving the accuracy of the load test suite. LoadNinja makes it easy to implement data-driven load tests via its databank functionality, which automatically loads test data from CSV or TXT files into any load test.
LoadNinja also simplifies other areas of the load testing process. With its record and playback functionality, you can easily record load tests within a browser without any scripting or dynamic correlation. You can then generate realistic loads using tens of thousands of real browsers that accurately represent your end-users.