- Prep for the Project Manager
- Prep for the QA Engineer
- Types of Tests
- Full Browser Test
- (Basic) Functionality Test
- Running QA Tests and Reporting Issues
- Naming/Labeling Issues
- Using cross-platforms to host media during QA
- If Reporting Errors in Basecamp
- A Word on Re-Testing
- Accessibility Audits
- Design Accessibility Audits
- Development Accessibility Audits
Prep for the Project Manager
- Review the walk-through on how to create an assignment for a QA Engineer, including a list of all materials needed.
- Refer to QA & Project’s Lifecycle for a walk-through on what to expect during the QA phase of a project.
- Refer to the QA Contingency Plan during emergencies where the QA Engineer assigned to your team is unable to test a project when needed.
Please note that failure to properly create an assignment for the QA Engineer will result in a delay in QA testing being run. 🤷‍♂️🤷‍♀️
Prep for the QA Engineer
- Your basic set of tools includes:
- URL(s) from where to test
- Site Documentation
- Note: The Site Documentation functions as your QA checklist and is your best friend as a tester. It tells you exactly what to look for and what to test.
- As a QA engineer, part of your responsibility is to ensure that this list is accurate and up-to-date.
- Issues repository for where to report any issues you may find
- Cantilever primarily uses GitLab. In rare cases, we may use Basecamp
- Note: The procedure Set up a GitLab Repository explains how to set up a repository for clients
- Deadline for when the job needs to be done
- Estimate of hours budgeted for you to complete the job
- Issue-Reporting template (You already have this: See below.)
- Information on what kind of browsers to test in a Full Browser Test can always be found in a project's Dev Notes
- BrowserStack, in addition to any direct access you may have to various machines and browsers
Types of Tests
Full Browser Test
Test in all browsers listed under the project's Dev Notes. If there is any uncertainty about which browsers to test, simply ask the DRI!
(Basic) Functionality Test
A basic functionality test can be run in Google Chrome alone.
Things to Keep in Mind
- Important: Make sure you refer to the Browser Support list in the notes for the particular Client or Project (also found in Dev Notes). The agreed-upon browsers for a project may differ from our standard set, so always check for project-specific differences.
- A Word on Mobile Devices: For mobile devices, checking one of each type (iOS Safari and Android Chrome) is enough unless you run into issues that need cross-checking. This is especially pertinent for Android devices: they used to vary widely but are now fairly standardized.
Running QA Tests and Reporting Issues
- Run through tests as described in the Project Documentation in the appropriate browsers.
- Ernesto (a former team member) made a really great template to use when reporting issues. Seriously, use this template. You can find the shortcodes for the template in TextExpander:
- Shortcode for Markup Template used in Git repositories (GitLab, etc.):
- Shortcode for Template used within Basecamp:
- When posting issues in Git repositories, make sure that you list all of the issues in a Milestone. (You may need to create this.) The milestone in the repository should be titled after the milestone being worked on in the Basecamp project. Simple, right!?
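The milestone itself can be created in the GitLab UI, or via the GitLab REST API. As a rough sketch (the base URL, project ID, milestone title, and token below are placeholders, not real values), the request looks like this:

```python
import json
import urllib.request

def build_milestone_request(base_url, project_id, title, token):
    """Construct (without sending) the GitLab v4 API request that creates a milestone."""
    url = f"{base_url}/api/v4/projects/{project_id}/milestones"
    data = json.dumps({"title": title}).encode()
    return urllib.request.Request(
        url,
        data=data,
        headers={"PRIVATE-TOKEN": token, "Content-Type": "application/json"},
        method="POST",
    )

# The title should mirror the milestone being worked on in the Basecamp project.
req = build_milestone_request("https://gitlab.com", 42, "Homepage Redesign QA", "glpat-...")
print(req.full_url)
```

Sending the request requires a real project ID and a personal access token with API scope; creating the milestone through the repository's web UI works just as well.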
Important: You are not required to fill out the entire form if it is not needed. Fill out the sections that are pertinent to your report and delete the rest.
Naming/Labeling Issues
- Names should strike a balance between being overly generic and overly specific. It is important that issue names are quick to read (without being too generic) and convey the specifics of the issue (without being too wordy). A good rule of thumb is as follows:
- Use the Project Documentation for noting the general area that is affected by the bug. As much as possible, refer to the Project Documentation for cues on what this area is called. List this area in brackets.
- In as few words as possible, list the specific item that has a bug or action that needs to happen.
- If the bug is browser-, device-, or layout-specific, add an en-dash and then list the specific thing.
- Universal bug (or action item): [Area] Issue/Bug/Action
- Type-specific bug
- [Area - Browser(s)] Issue
- [Area - Device] Issue
- [Area - Layout] Issue
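The convention above can be sketched as a small helper (the function name and the example areas and issues here are illustrative, not from the handbook):

```python
def issue_title(area, issue, scope=None):
    """Build a QA issue title per the naming convention.

    area:  the general affected area, named per the Project Documentation
    issue: the specific bug or action, in as few words as possible
    scope: optional browser, device, or layout the bug is specific to
    """
    prefix = f"[{area} - {scope}]" if scope else f"[{area}]"
    return f"{prefix} {issue}"

print(issue_title("Header", "Logo link returns 404"))
# [Header] Logo link returns 404
print(issue_title("Checkout", "Submit button misaligned", scope="Safari"))
# [Checkout - Safari] Submit button misaligned
```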
- From Project Documentation & Issues Reported:
Using cross-platforms to host media during QA
When posting a video that captures an issue, it is preferred to upload it directly to the related GitLab issue. However, there are times when the video's file size is too large to post in GitLab. There are several workarounds for this, but one of them is to post the video within Basecamp in the pertinent QA task. To save the dev from having to scroll up and down to find the right video, simply click on the date of the comment the video is embedded within; this updates the URL to a hard link to that particular comment on the page. Post this new URL within the GitLab issue to make the media easy to find.
Whether you are using the app or the website, by right-clicking the date you get a contextual menu with the option to copy the link address directly without needing to navigate to it first:
In general, many sites that are built around time-stamped comments (though not all) follow the same paradigm: the timestamp is the permalink to the comment. Basecamp, Stack Overflow, Reddit, Facebook, etc. all do this.
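As an illustration of the pattern (the `__comment_<id>` fragment format below is hypothetical; each site uses its own anchor format, which is why copying the timestamp's link directly is preferred):

```python
from urllib.parse import urlsplit, urlunsplit

def comment_permalink(page_url, comment_id):
    """Rebuild a page URL with a comment anchor as its fragment.

    The fragment naming here is illustrative only; real sites each use
    their own anchor scheme, so copy the link from the timestamp itself.
    """
    parts = urlsplit(page_url)
    return urlunsplit(parts._replace(fragment=f"__comment_{comment_id}"))

print(comment_permalink("https://3.basecamp.com/123/buckets/456/todos/789", 101))
```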
If Reporting Errors in Basecamp
Reporting issues within Basecamp should be done rarely.
- Create a separate list for these issues. All issues related to one to-do should get a group title that incorporates the name of the parent to-do.
- This is where using the Issues Template especially comes in handy.
Example: If the original task is: "QA Special NGO page" then the group name for the issue repository should be "Special NGO page Issues"
A Word on Re-Testing
Whether to re-test bug fixes with a functionality test or a full browser test is always your judgment call. Most bugs are global and affect all browsers, so a fix verified in one browser implies that it's fixed in all. Aesthetic (rather than functional) issues, though, should generally be re-tested in a smattering of browsers.
For instance, if a link is broken, that definitely only has to be re-tested in one browser. If an icon is the wrong size, that’s worth looking at in a few browsers.
Browser-specific bugs should obviously be re-tested in the original browser.
Accessibility Audits
Audits are different from but related to QA tests. The purpose of an audit is to find areas where a site or feature is not in compliance with WCAG standards (WCAG AA is typically considered the threshold for "ADA Compliant"). The QA engineer may make recommendations to improve or fix these issues.
Read more about Cantilever's commitment to accessibility in
Read more about Cantilever's developing history with accessibility in
There are two different kinds of Audits:
Design Accessibility Audits
The purpose of a design audit is to find and address areas of non-compliance before they even make it to the development stage. Here are some related resources:
Development Accessibility Audits
The purpose of a development audit is to find and address areas of non-compliance in how the site or features function. Here are some related resources: