Links

Getting started with Gatling

Running your Gatling script at scale through Flood

Requirements

To run your Gatling script on Flood, you'll need:
  • A working Gatling script with a .scala extension (we suggest running it locally to confirm that it works before running it on Flood)
  • (optional) If you want to set test parameters from the Flood UI, you'll also need to read specific properties in your script (see the sketch after this section)
  • All related files that are referenced by your test plan, including test data
You won't need to upload the Gatling directories or Java; Flood makes sure these are copied onto the nodes before your test.
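
Below is a minimal sketch of what such a parameterized script might look like. The property names threads, rampup, and duration follow a common Flood convention, but treat them, the defaults, and the example target URL as assumptions for illustration; adjust them to match your own script and what your Flood account expects.

    import io.gatling.core.Predef._
    import io.gatling.http.Predef._
    import scala.concurrent.duration._

    class FloodShakeoutSimulation extends Simulation {

      // Test parameters arrive as JVM system properties; the defaults below
      // only apply when you run the script locally.
      val threads: Int   = Integer.getInteger("threads", 1)
      val rampup: Long   = Integer.getInteger("rampup", 30).toLong
      val duration: Long = Integer.getInteger("duration", 300).toLong

      val httpProtocol = http.baseUrl("https://example.com") // hypothetical target

      val scn = scenario("Shakeout")
        .during(duration.seconds) {      // keep iterating until the duration elapses
          exec(http("home").get("/"))
            .pause(1.second)
        }

      setUp(
        scn.inject(rampUsers(threads).during(rampup.seconds)) // ramp users over the ramp-up period
      ).protocols(httpProtocol)
    }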

Create a stream

Follow the instructions in Scripting your load test to create a new stream for your test.

Select Test Type

Instead of the Test Builder option, select Script Upload.

Upload Test Scripts

Click the Choose Files button and select the .scala file containing your Gatling script.
If you don't yet have a Gatling script, here's a sample one that you can use as a test. You'll need to unzip it and upload just the .scala file to Flood.
GatlingExample.scala.zip (1 KB): Sample Gatling 3.0+ script for Flood
Repeat this step to upload any other resources your test requires, including test data. Everything you upload here is saved on each node under /data/flood/files. Make sure you have only one .scala file in the parent directory; if your main script refers to other .scala files, upload them within a zipped directory and reference them accordingly from your main script.
Wait until the resources have finished uploading. If this takes more than a few minutes, consider reducing what you're uploading.
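
For example, if you upload test data as a CSV, it will be available to your script on each node under /data/flood/files. The file and column names below (users.csv, username) are hypothetical; this is only a sketch of how an uploaded feeder file might be referenced:

    import io.gatling.core.Predef._
    import io.gatling.http.Predef._

    class DataDrivenSimulation extends Simulation {

      // Hypothetical uploaded file: resources uploaded to the stream land here on each node
      val users = csv("/data/flood/files/users.csv").circular

      val httpProtocol = http.baseUrl("https://example.com") // hypothetical target

      val scn = scenario("Data-driven")
        .feed(users)                              // pull the next row for each iteration
        .exec(
          http("login")
            .post("/login")
            .formParam("username", "${username}") // assumes a "username" column in the CSV
        )

      setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
    }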

Select Tool Configuration

Select Gatling in the Tool Configuration step.
This ensures that the correct tool is deployed on your nodes.

Add Advanced Parameters

Optionally, you can enter advanced parameters that will be passed to Gatling when your script runs.
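
As a sketch, assuming these parameters reach the JVM as system properties (for example, something like -DbaseUrl=https://test.example.com; the exact form is an assumption here), a simulation such as the one sketched earlier could read a custom value like this:

    // Read a custom system property supplied as an advanced parameter,
    // with a fallback value for local runs.
    val baseUrl: String = sys.props.getOrElse("baseUrl", "https://example.com")

    val httpProtocol = http.baseUrl(baseUrl)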

Configure Launch parameters

Click Configure Launch and you'll be taken to the Launch tab, where you'll set values for your test.
Launch tab of the Stream Editor
First, choose a region. This is the geographical region (corresponding to AWS availability zones) in which your nodes will start. Flood determines how many nodes you need based on the number of users you select. For your first test, we recommend selecting only one region.
Next, move the slider or enter the number of users that you'd like Flood to start in each region. Every region will have the same number of users. We suggest running a small number of users to avoid unnecessary costs during shakeout.
In Duration, you can choose how long you want the test to run. This does not include the amount of time it takes for the grid to start. We recommend a duration of 15 minutes or less for your first test.
In Ramp Up, enter the amount of time in minutes for the test to gradually ramp up to the number of users you set. You can generally set this to 0 for small shakeout tests.
The Summary section will display the total number of users your test will start as well as the number of regions, VUH used, and dollar cost. The VUH and dollar cost are estimates only; if you haven't yet used up your free 500 VUH, you won't be charged for this test.
Note that in the screenshot above, we've entered 1 user per region but the Summary displays 2 total users because we've chosen to run in two regions. The number of users per region is multiplied by the number of regions to calculate the total number of users.
Not seeing the summary with your cost estimate? Double-check that you're using test parameters in your script.
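If you want the values you set here to line up with what the script actually does, one common pattern (a sketch reusing the threads, rampup, and duration properties from the earlier example, with seconds assumed as the unit Flood passes) is to drive the injection profile from those same properties and add a hard stop:

    setUp(
      scn.inject(rampUsers(threads).during(rampup.seconds)) // "Users" ramped over the "Ramp Up" period
    )
      .protocols(httpProtocol)
      .maxDuration(duration.seconds)                        // hard stop aligned with "Duration"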

Execute your test

Finally, click Launch Test. In a few minutes, your test will begin executing.