walrus.ai expects a JSON payload in the request body when you trigger a test run.
This payload specifies what is being tested, how it is being tested, and any additional data needed to carry out that testing.
| Parameter | Description |
| --- | --- |
| `name` | A name to associate with the test. This will be displayed in your results. |
| `url` | The URL of the web application being tested. |
| `instructions` | The sequential series of steps that should be taken to carry out the test. |
| `variables` | A map of variables to be interpolated into the test instructions (see Providing Credentials below). |
| `revision` | An optional revision tag, such as a commit id, to associate with the test. This will be displayed in your results. |
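As a sketch, the request body for the 'Search' test shown below could be assembled like this. The field names come from the parameter table above; the values are illustrative placeholders, and the HTTP call itself (endpoint, auth header) is omitted:

```python
import json

# Build the JSON request body described above. All field names come from
# the parameter table; values are illustrative placeholders.
payload = {
    "name": "Search",
    "url": "https://my-app.com",
    "instructions": [
        'Enter "walrus" in the search bar',
        "Hit enter",
        "Make sure results are displayed",
    ],
    "variables": {},        # optional; see Providing Credentials below
    "revision": "abc1234",  # optional, e.g. a commit id
}

body = json.dumps(payload)
print(body)
```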
```shell
walrus -a YOUR_API_TOKEN -u https://my-app.com -n 'Search' -i \
  'Enter "walrus" in the search bar' \
  'Hit enter' \
  'Make sure results are displayed'
```
There may be times when you need to specify credentials or other data for walrus.ai to use when executing an integration test. Examples include providing a test credit card number, or a specially prepared account to log in to.
Any time you find yourself needing to provide additional data, use the `variables` request parameter.
These variables can then be referenced directly in your test `url` or `instructions` using the format
`:variable_name:` (note the colons before and after).
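To make the substitution concrete, here is a minimal sketch of how `:variable_name:` interpolation could work. This is an illustration only, not walrus.ai's actual implementation:

```python
import re

def interpolate(text: str, variables: dict) -> str:
    """Replace :key: placeholders with values from the variables map.

    Keys may end in '!' (the non-sensitive metadata marker described
    below). Unknown placeholders are left untouched.
    """
    return re.sub(
        r":([A-Za-z_][A-Za-z0-9_]*!?):",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        text,
    )

print(interpolate("Sign up with username :username!:", {"username!": "alice"}))
# → Sign up with username alice
```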
By default, walrus.ai considers data passed in via the `variables` object to be sensitive: it will not be included in the dashboard or integration results (Slack or Custom Webhook).
If you want to include non-sensitive metadata in your test, add a `!` to the end of the key name.
```yaml
# Run with "walrus -a YOUR_API_TOKEN -f /path/to/this/file.yml"
---
name: 'Sign-up Test'
url: 'https://my-app.com'
variables:
  username!: 'username'
  password: 'password'
instructions:
  - 'Sign up with username :username!:, password :password:'
  - 'Sign the terms and conditions'
  - 'Go to inbox'
```
Sometimes, you may want to add metadata from the environment. When using the walrus.ai CLI tool, you can interpolate environment variables directly in your test files.
```yaml
# Run with "walrus -a YOUR_API_TOKEN -f /path/to/this/file.yml"
---
name: 'Sign-up Test'
url: 'https://:environment!:.my-app.com'
variables:
  environment!: $DEPLOY_ENVIRONMENT
instructions:
  - 'Log in'
  - 'Sign the terms and conditions'
  - 'Go to inbox'
```
Imagine our company offers an email client as an alternative to Gmail. Our product has a lengthy onboarding flow and we want to ensure that new customers can finish the flow regardless of any future updates we make. This sounds like the perfect scenario for an end-to-end test!
First, let's walk through what steps an end-user would need to take to complete this onboarding process:
- Navigate to https://my-app.com
- Click "Register"
- Sign in with Google (third-party flow)
- Read + accept legal forms
- Proceed through a multi-step product tour
- Click "Okay" to continue to the inbox
- Compose an email
Since our instructions for walrus.ai are just an array of strings, translating these steps into an API request is straightforward.
```shell
walrus -a YOUR_API_TOKEN -u https://my-app.com -i \
  "Sign up with Google" \
  "Sign the terms and conditions" \
  "Proceed past inbox tour" \
  "Proceed past compose tour" \
  "Proceed past settings tour" \
  "Go to inbox" \
  "Compose an email and make sure it saves to drafts"
```
There's one big thing to note here: the instruction granularity is completely up to you.
In a traditional end-to-end test, you would have to specify your instructions just as a computer would expect them: step-by-step, with no room for interpretation or judgment. Thanks to the combination of walrus.ai and human judgment, such specificity is no longer necessary. Our complex onboarding flow, which spans navigation, third-party authentication, and form submission, is handled with a single instruction: "Sign up with Google".
However, when you do want increased granularity, perhaps to make stricter or more numerous assertions, that's also easy to accomplish with walrus.ai. In our example above, we specified each individual product tour step we'd expect. That specificity ensures we don't erroneously stop displaying a particular onboarding step to users.