Scripting an API load test

Tool selection

The most important consideration when choosing a load testing tool is, of course, whether it supports the protocols that you want to use. Luckily, API protocols tend to be supported by most tools due to their popularity, so this is rarely an issue. The second most important consideration is what resources you have available.

By resources, I mean the technical expertise and experience of the people on the testing team. Would your team be more comfortable with a nice UI, or do you feel at home coding away in a text editor or IDE? How much time do you have available to learn any tools that may be new to you?

A short proof of concept for the most promising tool should be carried out before building an extensive suite of scripts, and this should entail scripting a small sample of the requests. A proof of concept will show whether the tool is appropriate for the job.

Let me run through some of the most popular API load testing tools, starting with those that require the least technical experience and moving to those that require more.

Test Builder

A good place to start for those with no coding experience is a tool with a user interface. I’m going to use Flood’s own Test Builder as an example, just because that’s the one I’ve had most experience with.

The Test Builder is an interface built on top of JMeter that gives less experienced load testers a way to get started quickly. There’s nothing to install, there’s no code to write, and the tool actually creates a JMeter script for you in the background, so you won’t get stuck with a script that can’t be used anywhere else.

To start, you enter the domain you want to test and then add a step for each request that you’d like to send.

An advantage of this method is that running through Flood means you don’t have to worry about provisioning load generators or monitoring them, because Flood does that for you. It will also give you results for your test and is a great way to get up and running in a few minutes without much preparation.

One disadvantage is that it’s not free: Flood is a paid service with a free trial, so if you’re looking to go completely open-source, this isn’t for you. Another thing to keep in mind is that the Test Builder is meant for simpler scripts; you don’t get as much control over parameterisation or think time as you would with other tools.

Here’s a tutorial on how to use Test Builder with Flood.

JMeter

JMeter is an extremely popular open-source load testing tool, and for good reason. It’s got a solid history of being able to deliver results and is an industry standard.

To start with JMeter load testing, you’ll need to make sure you’ve downloaded Java as well as JMeter itself. Unlike the Test Builder, where you did your scripting in a web interface, you’ll be doing your scripting within the JMeter program.

One big reason to use JMeter is its UI. While not the best looking, it is relatively easy to use. It’s a step above the Test Builder in complexity, because there’s still a learning curve here, but it’s still a step below other tools where scripts are written entirely in code.

You can use JMeter to create robust load testing scripts without coding experience, but your JMeter load testing script can also include code in the form of Beanshell or Groovy post- and pre-processors. This flexibility means you can do a lot more with JMeter than you can with Test Builder alone.

Because JMeter is so popular, it’s very easy to get support online. Regardless of what you’re trying to do in JMeter, typing “JMeter load testing” into a search engine is likely to return a lot of results to learn from. It is also very well supported by the community: the JMeter project on GitHub boasts 15,722 commits in the 20 years it’s been around, and the number of custom plugins for JMeter load testing seems to increase every day.

Gatling

Gatling is another big player in the load testing tool space. Gatling load testing may not be as popular as JMeter load testing, but it definitely has its own share of avid fans, especially among developers.

The most divisive characteristic of Gatling is that it has no UI. Instead, you’ll write your Gatling load testing scripts in pure code in a text editor.

This may turn off some people, but others may be charmed by Gatling’s simple, streamlined approach to load testing. Gatling is written in Scala, which is a relatively user-friendly language, and it’s a lot easier to hook up to continuous integration tools than tools that require an interface to edit a script.

Gatling does have a smaller community behind it, so while there is still quite a bit of training material and plugins for Gatling, it can’t match the information that’s out there for JMeter.

One of the big advantages to Gatling load testing is that it is more efficient than JMeter for large-scale load tests. Unlike JMeter, which starts up a thread for every virtual user that needs to be run, Gatling uses a different structure that allows it to run more than one user per thread, minimising the total number of threads used. If you’re looking at a test that you hope to scale up to more than 100,000 users, you should consider using Gatling.

Other tools

There are a lot of other commercial tools out there. In this book, I’m going to focus more on JMeter and Gatling because they are open-source and free, aside from being extremely powerful load testing tools in their own right. However, if you have a bit more of a budget or possibly already have other tool licenses, here’s a round-up of other good load testing tools:

Initially created by Mercury, sold to HP and now owned by MicroFocus, LoadRunner has been a load testing staple for decades. While it is notoriously expensive, LoadRunner does have excellent support for a huge swathe of protocols, and it does most things very well. It also comes with the benefit of being able to hook up to the extremely popular Application Lifecycle Management suite.

Neotys NeoLoad is another great commercial alternative, and one that happens to be really good for new load testers. I have a soft spot for NeoLoad as it was the first load testing tool that I scripted with. Its drag-and-drop interface is somewhat similar to JMeter, but the whole package is just generally easy to use.

MicroFocus’ Silk Performer has also been a contender in recent years, and its customer support has been stellar for me. However, its future is unclear now that MicroFocus has bought the load testing darling, LoadRunner.

JMeter scripting

In this section you’ll learn how to get started with JMeter load testing. We’ll go through the steps for creating your first basic JMeter load testing script here, but you can also check your work against this sample script.

First, download and install JMeter as well as Java, which it requires. Then go to the /bin directory where you installed JMeter and double-click either jmeter.bat (Windows) or just jmeter (Linux). The JMeter window will open with your very first project, and the only thing you’ll see is an empty Test Plan.

The most basic JMeter load testing script will require these elements:

- Thread Group

- Transaction Controller

- HTTP Request Sampler

- View Results Tree Listener

- Uniform Random Timer

There are alternatives within JMeter for each of these, but for simplicity’s sake, I’ve included the most basic versions.

A Thread Group in JMeter-speak is a set of instructions that you want each user to follow. To add it, right click on Test Plan > Add > Threads > Thread Group. You should get a child element under Test Plan that looks like this when you click on it:

These are the default values for a Thread Group. Some of these values need to be changed, which we’ll cover later. For now, let’s focus on getting a single user to hit one endpoint.

A Transaction Controller is a way to organise a group of requests that logically belong together. This has reporting ramifications. For example, you might have 10 requests for resources for the same page. Rather than reporting on each of those separately, it might make more sense to report the total response time for the entire page. To add it, right click on Thread Group > Add > Logic Controller > Transaction Controller.

The only setting I would recommend changing is “Generate parent sample”. The Transaction Controller comes with this unticked by default, but I would tick it so that JMeter reports on the metrics for the transaction, not the individual requests.

The HTTP Request Sampler is the heart of our basic test plan. This is where JMeter will actually send the request to your API. To add it, right click on the Transaction Controller > Add > Sampler > HTTP Request. A “sample” is a request that has been sent, and a “sampler” is what JMeter calls these elements that allow you to build requests.

Click on the HTTP Request Sampler and fill in the following fields:

In this example, I’m just sending a simple GET request.

To view results while we’re running the test, we’ll first need to add a listener. A listener captures requests sent, as well as their corresponding responses, for us to view later. There are quite a few listeners, but the most basic one that you’ll need is the View Results Tree Listener. It records all the details of the request and response pair and is best for debugging. Go ahead and right click on Test Plan > Add > Listener > View Results Tree.

It will look pretty empty until we run a test, so let’s do that now. On the JMeter toolbar, you’ll see two buttons that look like a play button.

Click on the first one, the solid green play button. Now click on the View Results Tree listener. It may take a few seconds to get a response, but then your listener should look like this:

You’ll see that our request, “HTTP Request”, under the Transaction Controller, has been sent. By default, we’re on the Sampler result tab, which will show us some quick metrics about our request. Clicking on the Request and Response data tabs will allow us to see the raw request and response for that request as well.

The last basic element is the timer. For this tutorial, I’ve chosen the Uniform Random Timer because it’s easy to understand. A timer in JMeter is an artificially introduced delay. These delays space out the requests to make them more realistic— more on that later. For now, go ahead and add the timer by right clicking on Test Plan > Add > Timer > Uniform Random Timer.

Enter values in the Thread Delay Properties. A good place to start is a Constant Delay Offset of 10,000 ms and a Random Delay Maximum of 1,000 ms. This means that the delay will be at least 10,000 ms long, plus a variable amount of up to 1,000 ms. JMeter automatically excludes timer delays when calculating the response time.
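To make the arithmetic concrete, here’s a small plain-Scala sketch of how a uniform random delay like this can be computed (the function name is mine, not JMeter’s):

```scala
import scala.util.Random

// A uniform random delay: a fixed offset plus a random amount
// drawn uniformly from [0, randomMaxMs).
def uniformRandomDelayMs(constantOffsetMs: Long, randomMaxMs: Long): Long =
  constantOffsetMs + (Random.nextDouble() * randomMaxMs).toLong

// With the values above, every delay falls in [10000, 11000) ms.
val delay = uniformRandomDelayMs(10000, 1000)
```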

Now let’s click on that second green play button on the toolbar.

The first green play button runs the script including the timers, and the second one runs the script without any timers. For the most part, while we’re scripting and running just one user, we’ll want to use the second play button to save a little bit of time. Go ahead and play around with those now. Regardless of which one you use, the response time JMeter reports in the View Results Tree Listener will be the actual response time of the request, not including the timer delays.

Another thing to note is that timers are always applied before the requests, not after them.

If you’ve been following along, your test plan should look like this:

One final note about elements in JMeter before we move on to Gatling: where you put the element matters. For example, since we put the timer as a child of the Test Plan, we’re applying the timer to all samplers in that context. If we wanted to apply it only to one HTTP request, we’d need to put it as a child element of that HTTP Request Sampler.

Gatling scripting

First, download Gatling. You’ll want to get the standalone tool for this basic tutorial. You’ll also need Java to run Gatling.

Unzip Gatling. You’ll see that it comes with a few directories:

- bin contains Gatling’s execution engine and a recorder that will help you create scripts

- conf contains files you’ll modify to change the default Gatling configuration settings

- lib contains extra files you may require to run your simulations

- results will contain data from previous test runs

- target will contain simulations that you’ve successfully compiled

- user-files contains your simulations and Scala files

Gatling comes with a sample script by default, which you can find in /user-files/simulations/computerdatabase/BasicSimulation.scala, but let’s take a step back and create our own Gatling simulation to run. Create a folder in /user-files/simulations called gatlingsample and create a file within that folder called BasicSimulation.scala. You can download a copy of this script to follow along.

Here’s the most basic version of the script:

package gatlingsample

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

class SampleSimulation extends Simulation {
  val httpProtocol = http.baseUrl("") // This sets your base URL for all subsequent requests

  val scn = scenario("Basic") // A scenario is a chain of requests and pauses
    .exec(
      http("01_Home") // This sets the transaction name
        .get("/") // method and relative path to retrieve
        .check(, substring("Smooth Scaling").exists) // Verify that the HTTP code returned is 200 and that the substring exists
    )
    .pause(7) // think time

  setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol) // start one user immediately
}

This script contains one transaction called “01_Home” in the scenario “Basic”. It will start one user and make a single GET request. It will then look at the data returned, check for the text “Smooth Scaling”, and then pause for 7 seconds before finishing.

To run this script, open up your terminal. Unlike JMeter, Gatling doesn’t have a UI, so you’ll need to get comfortable with the command line in order to use it.

Change directory to Gatling’s bin folder and type ./ (Unix) or gatling.bat (Windows). Gatling will compile all the simulations it finds. Once that’s done, it’ll ask you which one you’d like to run.

Type 6 and hit the ENTER key. Enter an optional test description and hit ENTER, and Gatling will run the simulation. After the test has finished, you’ll see a cursory report:

If you copy that URL into your browser, you’ll see Gatling’s standard HTML report, which you can use to find out more detailed information about your test run:

Each run will generate a new report.

That script only makes one request, however, so we’ll need to instruct Gatling to iterate. In addition, the scenario so far only runs one user. Let’s make some adjustments:

package gatlingsample

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

class SampleSimulation extends Simulation {
  val threads   = 10
  val rampup    = 30
  val duration  = 300

  val httpProtocol = http.baseUrl("") // This sets your base URL for all subsequent requests

  val scn = scenario("Basic") // A scenario is a chain of requests and pauses
    .during(duration seconds) {
      exec(
        http("01_Home") // This sets the transaction name
          .get("/") // method and relative path to retrieve
          .check(, substring("Smooth Scaling").exists) // Verify that the HTTP code returned is 200 and that the substring exists
      )
      .pause(7) // think time
    }

  setUp(scn.inject(rampUsers(threads) during (rampup seconds))).protocols(httpProtocol)
}

This wraps everything in the scenario in a during {} loop, and you may notice that I’ve also changed the setUp line so that it ramps up to a certain number of users and then maintains that number for the whole duration. In this script, I’ve hardcoded values for those parameters, but if you’re running the script on Flood, you can have it take the values from the Flood UI:

// Optional, Tricentis Flood will pass in threads, rampup and duration properties from UI
  val threads   = Integer.getInteger("threads",  1000)
  val rampup    = Integer.getInteger("rampup",   60).toLong
  val duration  = Integer.getInteger("duration", 300).toLong
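Integer.getInteger is a standard Java helper: it reads a JVM system property (passed on the command line as -Dthreads=1000 and so on) and falls back to the default you supply when the property isn’t set. A quick plain-Scala sketch of that behaviour:

```scala
// Integer.getInteger looks up a JVM system property by name and
// returns the supplied default when the property is absent.
System.setProperty("threads", "500") // simulates launching with -Dthreads=500

val threads  = Integer.getInteger("threads", 1000)        // 500, from the property
val duration = Integer.getInteger("duration", 300).toLong // 300L, the default
```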

Download the final script here.

Making scripts more realistic

At this point, you should already have some load testing scripts that have been functionally shaken out. This means that you have executed the script with at least one user locally and verified on the backend that you’re hitting the right components without errors. For now, your script is just an automated way of sending requests. What makes it a load test?

Parameterizing environments

If you have several environments that you might like to test with the same script, it would be a good idea to set up your script so that it’s easy to switch between environments.


In JMeter, you can accomplish this by creating a variable for your environment. Right click on Test Plan > Add > Config Element > User Defined Variables. Then click Add and set the name and value for your variable.

You’ll want the value of the variable to be the URL of the environment you currently want to test.

Then, create an HTTP Request Defaults config element (right click on Test Plan > Add > Config Element > HTTP Request Defaults). In this config element, you’ll add ${environment} as the URL. That way, any requests with a blank field for the server name will take the value of the variable you created in User Defined Variables as the default domain.

The last step is to go ahead and delete the domain name of all HTTP requests so that the variable in the default is used.

This way, when you want to change the domain name you’re testing, you only need to change it in one place: User Defined Variables.


In Gatling, this is done by setting the base URL:

val httpProtocol = http.baseUrl("")

Then, succeeding requests will only need to have the path:

exec(http("01_Home") // This sets the transaction name
  .get("/")) // method and relative path to retrieve

To change the environment, all you’ll need to change is the base URL.

Increasing throughput

The most obvious way to turn a script into a load testing script is to increase throughput. This can actually be done in a few different ways.

Number of users or threads

This is the most obvious way to increase load. The more instances of your script running, the more requests are executed.

You can change this in JMeter by clicking on your Thread Group and changing the “Number of Threads (users)” field:

In Gatling, you can use the line setUp(scn.inject(rampUsers(threads) during (rampup seconds))).protocols(httpProtocol) in your script and replace the value of threads with the number that you’d like to run with.

Think time

Not including think time is a common mistake in load testing scripts. Think time is the amount of time that a user spends “thinking”: the delay between requests. Firing off many requests immediately one after another may sound like a great way to apply some extra load to your server, but this can actually backfire. A script without think time forces the node you’re running it from to work overtime to send the requests, and sometimes the node struggles to process each request before sending it to the server. The node itself can then become a bottleneck, queuing requests before they’re even sent to your servers and reporting high response times that don’t necessarily reflect how your application servers handle the load.

To prevent this from happening, add think time to your scripts that reflects a user’s normal wait time. This will space out the requests and lead to more accurate results.

However, it’s important to note that decreasing think time will also increase throughput, because each thread will be able to send more requests if the time it has to wait decreases. The right balance of think time needs to be struck in order to mimic production behaviour.
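As a rough sketch of that trade-off (assuming a closed workload where each user sends a request, waits out the think time, and repeats; the function name is illustrative):

```scala
// In a closed workload, each user completes one iteration every
// (response time + think time), so overall requests/sec is roughly
// users / iterationTime. Halving the iteration time doubles throughput.
def estimatedRps(users: Int, responseSec: Double, thinkSec: Double): Double =
  users / (responseSec + thinkSec)

val withLongThink  = estimatedRps(100, 0.5, 9.5) // 10.0 requests/sec
val withShortThink = estimatedRps(100, 0.5, 4.5) // 20.0 requests/sec
```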

In JMeter, this is done by using one of the many built-in timers. An example of this can be found in the section on writing a basic JMeter script, but you can use any of JMeter’s timers to suit your needs. Pay special attention to the context the timer sits in: JMeter applies the timer to all elements at that level, so you only need to add it once. Run the script with pauses to verify that the think time is being applied as you expect.

You can simulate think time in Gatling using .pause(1000 milliseconds, 3000 milliseconds) after the request you want to pause after. You can change the numbers to fit your scenario.

Concurrent requests

By default, requests are executed sequentially, in the order they are written in the script. Concurrent requests, by contrast, are executed in parallel: they are sent at the same time. In the same way that decreasing think time increases how many requests a user can make in a given amount of time, using concurrent requests also increases throughput because the requests are made in batches.

Web browsers actually send some requests concurrently, such as embedded resources on a page. Determine how API calls are made in production: are they usually called one after another (sequential) or made at the same time (concurrent)? This will determine whether you incorporate concurrent requests in your script.

This can be accomplished in JMeter by using the Parallel Controller instead of the Transaction Controller. Right click on the Thread Group > Add > Logic Controller > Parallel Controller:

All requests you put underneath this parallel controller will then be executed concurrently.

Gatling uses .resources to accomplish the same thing. Here’s an example adapted from the Gatling documentation (the paths are illustrative):

http("Getting issues")
  .get("/issues") // the main request; paths here are illustrative
  .resources(
    http("api.js").get("/assets/api.js"),
    http("ga.js").get("/assets/ga.js")
  )
It’s meant to be used to simulate the download of resources that are embedded into a page, but it will work for any request that you put in as a resource.

Setting test parameters

Test parameters are the main characteristics of the test and include the number of users, ramp up and duration.

We already discussed the number of users in the previous section, but it also belongs here as a key characteristic of the test.

Ramp up

In production, load on servers very rarely goes from 0 to 1000 in one second. Even for cases with a very definitive start time (such as an item going on sale at a particular time), the load generally increases gradually. This gradual increase can be simulated by adding ramp up times in your script. A ramp up is the amount of time during which new users are added at staggered intervals.
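With a linear ramp-up, the stagger works out to one new user every rampup ÷ users seconds. Here’s a quick plain-Scala sketch of that (not tool-specific):

```scala
// Under a linear ramp-up, user i starts at i * (rampupSec / users),
// so 10 users over a 30-second ramp-up begin 3 seconds apart.
def startTimesSec(users: Int, rampupSec: Double): Seq[Double] =
  (0 until users).map(_ * rampupSec / users)

val starts = startTimesSec(10, 30) // 0.0, 3.0, 6.0, ..., 27.0
```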

Here’s an example of what that might look like:

Duration

Duration is how long the whole test actually lasts.

The simplest way to set these values in JMeter is to use the basic Thread Group. You’ll need to change the fields highlighted below:

Note that in order to use the duration, you’ll have to select “Forever” next to Loop Count and “Scheduler” as well as type in the number of seconds in the Duration field. Otherwise, you can also run the script according to the number of loops, or iterations, that you want executed. However, since the response times will vary, using the Loop Count as a proxy for duration will yield different results from test to test.

In Gatling, the threads and ramp up are set in the line setUp(scn.inject(rampUsers(threads) during (rampup seconds))).protocols(httpProtocol), where you replace the parameters threads and rampup with the values you want. You can set the duration by including this code:

val scn = scenario("scenario")
  .during(duration seconds) {
    // Requests here
  }
and replacing duration.

Adding test data

Using the same data (such as user credentials) in your test may yield faster-than-average response times due to server caching. To avoid this, you should consider using a more diverse data set. Using a variety of data can also expose issues seen in production that arise due to data being in different states.

For example, a financial institution running a load test for mortgage origination might well get excellent response times when testing the same user on a “happy path”: a user who has filled out all the information for a loan and has passed the identity checks. However, using a larger set of users that is more representative of production may show that a user who does not fill out enough information for the identity checks experiences very slow response times as the application waits for the responses from the identity verification service in the background.

First, you’ll need a CSV file (other formats are available, but this one is the most commonly used for this purpose). Here’s an example of what this could look like:

username,password
user1,password1
user2,password2
user3,password3
Save this file as users.csv.

Next, you’ll need to modify your script to read these values.


In JMeter, you can add a CSV Data Set Config element to add this functionality to your scripts.

Click on the Test Plan element > Add > Config Element > CSV Data Set Config. Then click on the newly created element. The only thing you’ll need to change is the filename to include the path to the CSV file you’ve just created.

If you don’t put anything in the “Variable Names (comma-delimited)” field, JMeter will by default use the first line of the CSV file as the variable names.

Then go to the request where you want to use those values and replace the hardcoded values with the variables you want to use— in this case, ${username} and ${password}. Note that you wouldn’t really pass these in the clear like this in your application; this is just a simple example to get you started.

When you run your script, you can confirm whether the request has picked up these values by clicking on the View Results Tree listener, clicking on the HTTP request and clicking the Request tab. You can see below that the script has automatically used the values from the first line of our users.csv file.

If you have a lot of test data that you’re using this way, one way to see the current values at a glance is to add a Debug PostProcessor by right clicking on the sampler (HTTP Request) > Add > Post Processors > Debug PostProcessor. The next time you run your test, you can click on this post processor and go to the Response data tab, which will show the current values of all the variables you’re using. This can be a great way to debug between iterations. Just remember to disable it while you’re running your full-scale load test.


We’re going to use the same users.csv file that we created earlier, but in order for Gatling to recognise it, it will need to be in a particular folder. The default path for this folder can be viewed or changed in /conf/gatling.conf:

#data = user-files/data # Directory where data, such as feeder files and request bodies are located (for bundle packaging only)

It’s worth checking this, as some later versions of Gatling may use user-files/resources instead. Assuming you have this set as above, create a data folder under user-files, then move the CSV file into /user-files/data.

In Gatling, you’ll need to first set up what’s called a feeder, which is basically a way to get data from a file and bring that data into the script:

val csvFeeder = csv("users.csv").circular

Note that this line is for CSV files particularly, although there are other types of feeders that you can use with Gatling. If your users.csv is not in the default data directory, you will need to include the filepath. For example, when running this on Flood you’ll need to change this line to:

val csvFeeder = csv("/data/flood/files/user-files/data/users.csv").circular

But for now, since we’re just running this locally, you can leave it as is.

The circular at the end of the line tells Gatling that when the script runs out of data to consume (such as when you have five users and only four lines of data), it should go back to the beginning and re-use the same data.
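The wrap-around behaviour itself is easy to picture. Here’s a tiny sketch of a circular feed over four rows (plain Scala, not the Gatling API):

```scala
// A circular feeder: consume rows in order and wrap back to the
// start when the data runs out, so a fifth user re-uses row one.
val rows = Vector("user1", "user2", "user3", "user4")
def circularRow(userIndex: Int): String = rows(userIndex % rows.size)

val fifthUsersRow = circularRow(4) // wraps around to "user1"
```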

Now, to use the data, you’ll need to refer to it by the headers on the data file itself. In our case, it’s username and password, so here’s how we could use that data to pull in a line of values into a post request:

.feed(csvFeeder) // pulls the next line of values from the feeder
.exec(http("02_Login") // the transaction name and /login path are illustrative
  .post("/login")
  .formParam("username", "${username}")
  .formParam("password", "${password}"))
Of course, this is a simplified version of a login, but the principles remain the same. Here’s what that looks like on Flood:

To follow along, you can download this sample Gatling script here.

Using different user paths

Your basic script will likely have one business process that all your users follow. This could be, for example, a path that a user takes from landing on the home page to browsing a catalog of items for sale to actually adding the item to a cart and checking out. While this path actually exists in your application, one thing you might consider is alternative user paths.

For example, perhaps instead of browsing, a user searches for a particular item, finds that it is out of stock, and signs up to be notified when the item is back in stock. It might be worth thinking about scripting a separate flow to account for users like this, depending on how common this situation may be.

But how do you add that into your load testing?

In JMeter load testing, one common way to do this is by simply adding another thread group and then scripting another path inside that thread group to run alongside your first thread group.

An advantage of this is that you can set the number of users for each thread group separately, so you can control the ratio of the different paths that you decide to script.
