Testing Handbook

The purpose of this document is to highlight the common reasons tests fail and how to avoid them, as well as the use-cases that walrus.ai supports.

If you're looking for the basics on how to write instructions, see The Basics: Writing Instructions.

Supported Cases

File Upload

To reference a file during a test, first upload the file under Test Resources. The file's name is then referenced within your instructions.

name: 'Photo Upload'
url: 'https://etsy.com'
variables:
icon: 'walrusicon.png'
instructions:
- 'Click the upload button'
- 'Select the :icon: file'
- 'Click upload'
- 'Verify the selected file uploads'

Results:

uploading a file in an integration test

Checking emails or authenticating with Google

For login purposes or to verify emails, you can use walrus.ai-provided addresses such as winona@walrus.services, as well as aliases created from them:

instructions:
- 'Click Add Members'
- 'Enter a random alias of winona@walrus.services. Click Invite'
- 'Verify an email is sent to winona@walrus.services'

Results:

Sending the invite:

sending an email in an integration test

Receiving the email:

receiving an email in an integration test

Checking an SMS

You can verify that an SMS has been sent, or send a code via SMS. To verify SMS messages are received, use the phone number below:

805-369-1060

Example:

variables:
phone: '8053691060'
instructions:
- 'Click "Confirm my phone number" and enter :phone:'
- 'Verify a code is sent to :phone:'
- 'Enter the code and submit'
- 'Verify login is successful'

Streaming Video Media

With walrus.ai, we'll automatically stream video and audio during test executions that require them.

Example: joining a video call

instructions:
- 'Click event'
- 'Click Join with Google Meet'
- 'Click camera icon'
- 'Click Join Now'
- 'Verify Video call is joined with user avatar in center'

sending a video invite

Multi-user / multi-session flows

Tests can be written with multiple tabs, as well as multiple browser sessions.

Example: Multiple tabs

If you request to create a new tab, we'll do so in the same browser session.

variables:
new_url: 'https://amazon.com'
instructions:
- 'Search Kindle'
- 'Select the first result'
- 'Open :new_url: in new tab'

creating a new tab in an integration test

Example: Multiple browser sessions

If you add an instruction to take any action as a second user, we'll do so in a second browser session.

variables:
user1: 'winona@walrus.services'
user2: 'winston@walrus.services'
url: 'https://notion.so'
instructions:
- 'As :user1:, type in Hello'
- 'In a new browser session, log in to :url: as :user2:'
- 'As :user2:, confirm you see Hello under Test space'

Results:

Browser session 1:

creating a new browser session in an integration test

Browser session 2:

switching browser sessions in an integration test

The top right indicates which session is currently in view, and how many sessions there are in total:

which session is the test in

Slack integrations

Validate that the Slack integration works successfully:

variables:
slackworkspace: 'walrusai.slack.com'
login_user: 'winona@walrus.services'
instructions:
- 'Toggle Connect Slack channel'
- 'Sign into :slackworkspace:'
- 'Sign into Slack with Google with :login_user:'
- 'Set #random channel and allow permissions'
- 'In a new tab, log into Slack'
- 'Continue with browser'
- 'Click #random channel and confirm Welcome to Notion! appears'

integration with slack in an integration test

sending a message to a slack channel

File download

Download files (instructions can reference a specific file extension):

instructions:
- 'Find .pdf file on Recent list'
- 'Click 3 dots next to file'
- 'Download'
- 'Confirm file downloads'

downloading a file in an integration test

Chrome Extensions

To test a Chrome extension, first add the extension to your Test Resources.

adding Chrome extension to test resources

When adding a new extension, you'll be asked to either upload it as a zip archive or install it from a link, and to provide a name for the extension. This name is what you'll reference when adding the Chrome extension to your test models.

uploading Chrome extension as a link

uploading Chrome extension as an archive

After your Chrome extension is added to your Test Resources, you can attach it to test models on a per-model basis, either to existing tests or as you create new ones.

adding Chrome extension to model

You can also add a new extension through the test wizard as you are creating test models.

adding Chrome extension through test wizard

Once your Chrome extension is added, you can check your Test Resources to see which models include it.

Chrome extension test models

Under Actions, you can edit your Chrome extension. If you change the extension's name or its contents, the change is reflected across all tests that use it.

editing Chrome extension

After associating the extension with your model, refer to the extension explicitly in your instructions whenever you want to interact with it, for example when it is embedded in the Chrome header. Referring to the extension by name avoids ambiguity about which part of the browser is being interacted with.

In short, you can follow these guidelines to improve the clarity of your test:

  1. When you need to interact with the extension, provide an explicit instruction to click on it (if needed)
  2. When interacting with the extension popup, preface your instruction with "in the extension popup"
  3. When referencing the main browser window, preface your instruction with "in the browser window"

Other considerations:

  • If the extension popup blocks another instruction from being completed, we will click the extension icon in the header to dismiss the popup, or close the popup itself.

A GOOD INSTRUCTION:

Both the assertion and the interaction with the Chrome extension are clearly specified:

instructions:
- 'Click on the Walrus Chrome extension'
- 'In the extension popup, sign-in with Google with :example_user:'
- 'In the browser, verify "Welcome!" loads'

A BAD INSTRUCTION:

In the instructions below, it is not specified which "Sign in with Google" button should be clicked (the one in the browser or the one in the extension), nor is it specified in which window the assertion should be made.

instructions:
- 'Click on the Walrus Chrome extension'
- 'Click sign-in with Google'
- 'Verify "Welcome!" loads'

Unsupported Cases

Below are a few use-cases we do not currently support:

  • Mobile testing
  • File drag and drop
  • Ephemeral assertions (making an assertion on an element that loads, but then subsequently disappears)
  • Verifying content within downloads
  • Making API requests

If you have any feature requests, please reach out to support!

Common Issues and How to Fix Them

Test concurrency issues: running one test causes the other to fail

Example:

Test A: change the password of user1

Test B: log in as user2, and change the profile picture

If both tests were to run at the same time, Test B could fail because it is unable to log in, or unable to continue, due to the updated password.

How to fix concurrency issues

In order to avoid concurrency issues, follow one of the below steps:

  1. Use separate environments or accounts to avoid the issue, e.g. run the change-password and forgot-password tests on separate accounts so they don't interfere with each other (see the sketch after this list).
  2. Schedule the tests (either using the CLI, or in our dashboard) to run separately.
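
As a rough sketch of option 1, each test can be pinned to its own account through variables, so concurrent runs never share credentials (the instructions below are illustrative placeholders; the two accounts reuse the walrus.ai addresses shown earlier):

Test A, using its own account:

variables:
user1: 'winona@walrus.services'
instructions:
- 'Log in as :user1:'
- 'Change the password for :user1:'
- 'Verify the password change is saved'

Test B, using a separate account:

variables:
user2: 'winston@walrus.services'
instructions:
- 'Log in as :user2:'
- 'Change the profile picture'
- 'Verify the profile picture is updated'

Because the two tests never log in with the same account, changing the password in Test A cannot block the login in Test B, even when both run at the same time.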

Test Teardown: a previous run causes the test to fail in a subsequent run

Sometimes, test executions alter the test environment they run in, such that the next time the test is run, it will necessarily fail.

If a test will fail when re-run, you will need to include teardown instructions to clean up the test environment.

Example: Test A without Teardown

instructions:
- 'Click create list'
- 'Name list "To do"'
- 'Verify there is only one list on the page named "To do"'

In the first run, we'll be able to complete these instructions successfully. In the second run, however, the test will fail because the list name "To do" is already taken. Therefore, we need to include instructions to clean up the testing environment.

How to fix teardown issues

To fix teardown issues, include the teardown instructions at the beginning of the test, so they are run every time.

To make them more flexible, write the teardown instructions as conditionals, so they can handle both the case where teardown is needed and the case where it is not.

Example: Test B with Teardown

instructions:
- 'If a list loads on the screen called "To do", click delete on the list'
- 'Click create list'
- 'Name list "To do"'
- 'Verify list is made'
- 'Verify there is only one list on the page named "To do"'

As you can see, the first instruction is the conditional teardown: if a list called "To do" exists, we'll delete it; otherwise we'll simply continue to the next instruction.

Static IP Addresses

If your staging environment blocks unknown IPs, you can add the static walrus.ai IP addresses to a safelist so we can access the environment to execute tests. The IPs can be found below:

35.199.184.228
34.82.88.124
34.105.102.82
35.185.212.108
35.230.37.106
34.83.51.217
35.185.237.23
34.105.79.165
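
If, for example, your staging environment sits behind nginx with a deny-by-default policy, a minimal safelist sketch might look like the block below (this is only an illustration; apply the equivalent rules to whatever firewall, proxy, or cloud security group actually fronts your environment):

# Example nginx safelist: allow the walrus.ai runners, deny all other clients
location / {
    allow 35.199.184.228;
    allow 34.82.88.124;
    allow 34.105.102.82;
    allow 35.185.212.108;
    allow 35.230.37.106;
    allow 34.83.51.217;
    allow 35.185.237.23;
    allow 34.105.79.165;
    deny all;

    # ...existing proxy or application configuration for the staging environment...
}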