QA Blog Posts

QA Image Library

Check out the growing QA Image Library. This is my personal collection of Slack images for that perfect QA moment.

August 3, 2021

QA Fail: Mass Pike Sign

August QA Blog Content

Throughout the month of August, I'll showcase some real-life QA fails. These are signs and things that certainly would have failed QA.

If you have driven down the Massachusetts Turnpike, chances are you might have seen this sign:

Mass Pike Transponder Mobile

Photo Credit: This is a screenshot from Google Maps.

Mass Pike Transponder Desktop

Photo Credit: This is a screenshot from Google Maps.

This sign appears multiple times on the Massachusetts Turnpike, usually right after the electronic toll area.

Four Reasons This Fails QA

Who is this sign for? Here are four reasons that QA would fail this sign:

  1. The sign is way too small to comprehend while driving 55+ mph.
  2. Who would have time to write down the phone number?
  3. Who's going to write down the URL? Let alone remember it for later.
  4. Most vehicles on the MassPike don't have passengers - which means no one can take a picture of the sign. In Massachusetts, it's illegal to hold a phone while driving.

July 27, 2021

QA Graphics

Here’s some QA graphics that I found in various QA channels. Check out the QA Image Library for a huge selection of Quality Assurance graphics.

This month I focused more on quality graphics. I didn’t want to throw up images just to have a blog post. In QA, it's about quality, not quantity.

  • Testing in Production
  • Quality Testing Logo
  • Wahoo Time for Testing
  • (FireFly Design) - Don't Let the Bugs Escape.

Testing In Production 2021
What Testing in Production Looks like.

Quality Testing 2021 logo
Quality Testing (Logo?)

Woo Hoo Testing Time
WooHoo Time For Testing

Dont Bugs Escape
Don't Let the Bugs Escape!

July 20, 2021

How to create great CTAs using A/B testing

A vector image that illustrates A/B testing.

If you’ve delved into Landing Page Optimization (LPO), chances are you’ve pondered your Call to Action (CTA) buttons. Even if you haven’t, they deserve your attention. CTAs are your literal calls to action, and thus serve as your final conversion step. That’s what all your marketing, social media, and link-building efforts aim to push audiences toward. Where great CTAs do those efforts justice, underperforming CTAs can severely devalue them. So, how can you fine-tune them for better conversion rates? Simply by testing their relevant attributes one by one. In other words, you can create great CTAs using A/B testing, and here is how.

What are CTAs?

But before we explore the main subject, let us briefly define those two terms, starting with CTAs.

In brief, CTAs are buttons on landing pages that encourage visitors to take the desired action. Examples may include:

  • Signing up for a newsletter
  • Downloading a free copy of a report
  • Purchasing a product or service

Conversions and conversion rates

In turn, effective CTAs are invaluable for driving more conversions on your landing pages. As CTAs are direct calls to action, they are at the forefront of LPO and Conversion Rate Optimization (CRO). Put differently, they are the final arbiter of whether a visitor converts or leaves.

That said, there are different types of conversions, and each calls for a different approach to CTAs. Specifically, Google Analytics identifies two distinct conversion types:

  1. Micro conversions. These are conversions that push the visitor further into your sales funnel, such as newsletter signups.
  2. Macro conversions. Instead, these are completed transactions.

Then, it identifies four distinct conversion groups:

  1. Acquisition.
  2. Inquiry.
  3. Engagement.
  4. Revenue.

Among the four, only “revenue” refers to macro conversions. The other three are likely earlier in your sales funnel, so they will require different CTA styles and copy.

What is A/B testing?

Now, to create great CTAs using A/B testing, we’ll also need to define A/B testing equally briefly. A/B testing, as the name implies, is the process of testing two variations of a page against each other. Typically, the process relies on modifying a single page element, and then directing traffic to both the original (A) and the modified version (B).

Understandably, A/B testing can’t succeed on a hunch. Instead, it typically requires such assets as CRM software and heat maps to inform which elements are modified and how.

Finally, A/B testing is one of 3 types of page testing:

  • A/B testing
  • Split testing
  • Multivariate testing

Granted, “A/B testing” and “split testing” see interchangeable use, but many do distinguish the two.

How to create great CTAs using A/B testing

With the above in mind, we may now discuss our main subject. To do so, we’ll need to split it in two; the process itself, and the CTA modifications.

The A/B testing process

First and foremost, the process itself is crucial. Indiscriminately testing CTA elements, or testing many elements at once, will very rarely yield reliable results. Instead, you may take a more calculated, focused approach.

#1 Decide on your elements

Initially, you should decide which specific elements you will test. CTA elements to consider are:

  • Color
  • Font
  • Copy
  • Style
  • Placement
  • Shape
  • Size
  • Timers

With this many factors, you should understandably begin with a data-driven hypothesis. Thus, you may use CRM, heat maps, and any other analytics tools at your disposal to inform your choice.

#2 Create multiple variants

Then, you may test your hypotheses in action. To do so, you will need to create multiple “B” variations and test them individually against your “A”. Luckily, the digital age offers heat maps, so you may consider the following metrics they can provide:

  • How many visitors see your CTAs? Scroll maps will reveal how many visitors scroll to your CTAs.
  • How many visitors identify your CTAs and react to them? Move maps will provide these insights.
  • Do enough visitors click on your CTAs? Click and touch maps will reveal how many clicks and taps your CTAs get.

#3 Monitor your results

Finally, you should take the time to properly analyze your tests’ findings. As you do, ensure you build the framework for more, and more conclusive tests. For this step, consider the following:

  • Ensure you have a decent sample size to draw conclusions from.
  • Let your tests run for long enough to ensure accuracy.
  • Identify which specific elements led to success or failure each time, and use them for further tests.
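One way to judge whether a sample is big enough and a result is conclusive is a two-proportion z-test on the click counts. Here's a minimal Python sketch using only the standard library; the visitor and click counts are made-up numbers for illustration:

```python
import math

def ab_significance(clicks_a, visitors_a, clicks_b, visitors_b):
    """Two-proportion z-test: is variant B's click rate
    significantly different from variant A's?"""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = ab_significance(120, 2400, 156, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference
```

A common rule of thumb is to keep the test running until you have at least a few hundred conversions per variant; stopping the moment p dips below 0.05 inflates false positives.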

The CTAs

The process aside, creating great CTAs using A/B testing also hinges on CTAs themselves. While exact modifications should absolutely rely on your own analytics and tests, there are a few set practices to consider. In no particular order, these are the following.

#1 Refine your copy to encourage action

The very purpose of a CTA is to inspire action, so your copy must reflect this. Thus, lead with imperative verbs; “sign up”, “get this”, and “click here” are CTA staples for a reason.

At the same time, visitors must know exactly what they’re getting out of it. To do so, make your proposal clear; frame your CTA with clear information so that visitors know what you’re offering.

#2 Keep it simple

However, clarity comes from simplicity; long CTAs are often visually unappealing. Thus, you should strive for copy simplicity:

  • Keep it between 2 to 5 words
  • Avoid jargon
  • Use adjacent page space for more information

Of course, simplicity is also relative to what your unique audience expects. Average audiences might dislike jargon, for example, but invested, savvy audiences might appreciate it in moderation.

#3 Keep your CTA visible and above the fold

On the subject of placement, CTA visibility is among the most common missteps. Here, consider such factors as the following:

  • Ensure your chosen font is highly readable
  • Use a contrasting CTA color in relation to the page
  • Place your CTA above the fold to ensure visibility
  • Avoid visual clutter around the CTA
  • Limit each page to one CTA to avoid decision fatigue

#4 Add a countdown timer to incite urgency

Having ensured simplicity, clarity, and visibility, you may then consider countdown timers to incite urgency. The role of your CTAs is to inspire action, after all, and few things do so as successfully as urgency.

For reference, consider that Sleeknote found that popups with countdown timers outperformed ones without timers by 113%. In principle, the same applies to CTAs, which should explain the common “sign up now”, “hurry!”, and similar copy choices.

#5 Make it stylish, where appropriate

Finally, you may attempt to spice up your CTAs. Most notably, you may employ humor; for example, OptinMonster’s exit-intent popup cries “that’s abandonment!” as you leave. HootSuite’s exclaims, “well, this is awkward”, before continuing that they “could’ve SWORN you were someone” interested in their services.

These are evidently very effective, as their audiences find them charming. However, this kind of humor is in line with their branding, and resonates with their audiences. Yours may both differ, so when trying to create great CTAs using A/B testing, make sure to do your due research.

July 13, 2021

Goodhart's Law

Recently I was browsing the Sketchplanations collection and found this graphic about Goodhart's Law:

Goodharts Law

Goodhart's Law

When a measure becomes a target, it ceases to be a good measure.

QA View

Understanding Goodhart's Law is useful for QA because it's a reminder that a project without a clear objective - specifically one that benefits the consumer - can lose focus. All the work and effort may then cause more harm than good.

Some questions to ask when working on a project. Don’t wait for a retrospective to ask these questions. Think about these at the beginning of each sprint:

What are the goals of the project?

What problem are we trying to solve?

Will this solution actually fix the problem, or will it cause more pain points for the customer?

Check out sketchplanations

Definitely check out other Sketchplanations drawings. They do a great job in explaining various topics.

July 6, 2021

Computer Change Up

Last week I got assigned a new laptop, and this week I'll be configuring all the various QA applications on it. One of the things that came to mind during this process is how many testing tools that I don't really use anymore.

This got me thinking that moving to a new laptop is a good time to do an assessment of the tools that I really need to have. Over the years, I tested various apps, but never uninstalled them.

Consider a Change

If you've been using the same laptop/desktop for the past three years, you may want to consider a new one before starting that next big project. It's a good way to start fresh.

Five Things I Have Learned

  1. Getting the computer into a productive state takes time. It takes time to find the right software and licenses. Once the installation is done, there's still time needed to configure the software the way you remember it - such as custom keyboard shortcuts and layouts.
  2. I used the opportunity to rearrange my physical desktop. The new MacBooks have USB-C ports, so I can connect external monitors on either side. This allowed me to switch things around for a more practical workspace.
  3. I have found that cloud services make migration so much easier. Dropbox and OneDrive make moving files between computers easier. Also, many of my apps' settings are stored in Dropbox, so they pick up their configurations as soon as the applications are installed.
  4. Keep it simple. I am committed to not installing software on this computer that isn't essential to my work. My other computer had a lot of graphics apps that I really don't need.
  5. Document everything. I used to store software license keys in various places. As I install apps, I am recording them in a single app - so I have a single source for next time. This includes documenting which apps are used for what functionality. Think of it as a Wiki page for essential apps.

June 30, 2021

Manual Of Style

Sometimes QA will encounter wording that looks a bit off and may need some clarity on how the text should read.

One of the best resources is Wikipedia's Manual of Style. This is the writing style guide used across millions of webpages.

This is a useful reference when you're looking for support for a better solution than the one being implemented.

Manual of Style Logo

Example Format

How should you display the date?

April 5th, 2021 - June 4th, 2021

According to the Manual of Style, the easier-to-read format would be:

April 5 - June 4, 2021.
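If you ever generate date ranges in code, a small helper can apply the same rule automatically. This is just a sketch of the convention shown above - state the year once when both dates share it:

```python
from datetime import date

def style_date_range(start: date, end: date) -> str:
    """Format a date range per the Manual of Style example above:
    no ordinal suffixes, and the year stated once when shared."""
    if start.year == end.year:
        return (f"{start.strftime('%B')} {start.day} - "
                f"{end.strftime('%B')} {end.day}, {end.year}")
    return (f"{start.strftime('%B')} {start.day}, {start.year} - "
            f"{end.strftime('%B')} {end.day}, {end.year}")

print(style_date_range(date(2021, 4, 5), date(2021, 6, 4)))
# April 5 - June 4, 2021
```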

Style for the Modern Era

The nice thing about this style guide is that there's a whole section on hyperlink format. Check out the link section to get an idea on what text to link to.

June 23, 2021

Chrome Request and Response Headers

Sometimes QA needs to debug the HTTP requests the browser is making. This is usually to make sure some conditions are met or to figure out why something isn't working.

In Google Chrome, this is a pretty simple task.

Chrome Header Help

Finding the HTTP Header

  1. Right-click on the page and select Inspect (or type Option-Command-I)
  2. Select the Network tab at the top of the DevTools panel and then reload the page (Command-R)
  3. Select any Name (these are the various requests that were made during the page load)
  4. On the right, select the Headers tab

This is where you'll see the request and response between your browser and the server. There's a lot going on here!

You should be able to get a lot of data that may help troubleshoot any issues.
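If you'd rather script this check than click through DevTools, here's a minimal Python sketch that prints a request's response headers - the same data the Headers tab shows. It spins up a throwaway local server so the example runs anywhere; in practice you'd point it at your own URL:

```python
import http.server
import threading
import urllib.request

# Throwaway local server so the example is self-contained.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Make a request and dump the status line plus every response header.
url = f"http://127.0.0.1:{server.server_address[1]}/"
with urllib.request.urlopen(url) as resp:
    print(resp.status, resp.reason)
    for name, value in resp.getheaders():  # Server, Date, Content-type, ...
        print(f"{name}: {value}")

server.shutdown()
```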

Why Should QA Know This?

QA should have the knowledge of all the tools in the browser toolbox. Debugging JavaScript, jQuery and Angular isn't enough. Sometimes, as part of validation, QA may need to read the HTTP Request Headers and verify that the browser is sending the correct information.

You can learn more about HTTP Request and Response Headers from DumbTutorials. It's a good place to catch up on everything HTTP.

June 16, 2021

Cobra Effect

The cobra effect is when a solution to a problem results in the problem becoming worse.

How the effect got its name:

Many years ago in India, there were a lot of cobras, and they were dangerous. The British government thought it would be best to encourage locals to catch the cobras, so they offered cash incentives for people to kill them. However, the locals saw this as a business opportunity and started cobra farming. Once the British realized that the incentive wasn't solving the problem, they stopped the bounty. The farmers were stuck with a lot of cobras and ended up setting them free. The net result was a lot more cobras than before the bounty.

Cobra Effect Logo

QA Example

Here is a real-world example where I have seen the Cobra Effect in QA.

The $1 Patch Jar

There was a point in time that we were having too many Production Patches. Many of these patches were a result of engineering not providing detailed descriptions of the risk of the changes being implemented.

Management implemented a "$1 Patch Jar" where those that were responsible for a patch had to put in money. This was to encourage developers to test their code.

The problem was that people were putting in a lot more than $1 just to cover any future mistakes. As a result, developers were more relaxed about testing their code, knowing that they had already "paid" for the bug, and the patch count didn't go down.

The jar was discontinued. Management instead implemented more responsibility for Dev and added that responsibility to their performance review.

I intended to post a couple of examples, but I could only think of the one. If I think of something in the future, I'll be sure to post it.

June 9, 2021

Automation

Automation Logo

Automation is not testing; it's checking.

What is Automation?

Automation makes humans more efficient, not less essential.

Automation is not a silver bullet - it won't immediately increase productivity, but if approached correctly, it will eventually support faster releases, greater test coverage, and better overall product quality.

The Great Debate?

There's this debate in the testing community on the purpose of automation.

There's one group that talks about how automation is testing. They argue that automation tests to make sure that the build is stable and that changes don't break existing functionality.

There's a different group that says automation is checking. They argue that automation is only checking predefined paths and not testing situations that may be unique to the change being made. While critical paths may be working, some other functionality may be broken.

Team Checking

I am with team checking. I believe that the purpose of automation is to check the stability of the build. Once automation passes, QA can perform manual testing.

Automation can be limited to the scope of checking/validating. In my experience, it can be tricky to automate 3rd-party integrations. Manual testing can be more effective in validating that functionality.

Automation Has Value

Saying "Automation is Checking" doesn't change the value of automation. Automation plays a valuable role in the testing process. Don't put all your eggs in one basket and solely rely on automation as your testing tool. Automation complements manual testing.

Automation test cases should be designed so that the build/branch is stable for manual testing. It should be checking that the critical paths are working.
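As a rough illustration of that idea, here's a minimal Python sketch of a critical-path check. The base URL and paths are placeholders, not a real environment - a pass just means the build is stable enough to hand over for manual testing:

```python
import urllib.error
import urllib.request

def smoke_check(base_url, paths):
    """Check (not test) that each critical path responds with 200.
    Returns a list of (path, problem) pairs; empty means stable."""
    failures = []
    for path in paths:
        try:
            with urllib.request.urlopen(base_url + path, timeout=10) as resp:
                if resp.status != 200:
                    failures.append((path, resp.status))
        except urllib.error.URLError as exc:
            failures.append((path, exc))
    return failures

# Example usage against a hypothetical staging environment:
# failed = smoke_check("https://staging.example.com",
#                      ["/", "/login", "/checkout"])
# print("build stable" if not failed else f"blocked: {failed}")
```

Note what this deliberately doesn't do: it never explores edge cases or new functionality. It only confirms the predefined paths respond, which is exactly the checking/testing distinction above.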

Automation Checking

Final Thoughts

There's no such thing as a manual or automated tester. Are you a manual or automated programmer?

June 2, 2021

June QA Images

It's been a while since I added some images to the QA library. Here are some new ones to my growing collection.

Be sure to check out all the QA Images in the QA Image Reference Library

  • Quality Assurance Logo 2021
    http://www.cryan.com/qa/graphics/2021/QualityAssuranceLogo2021.jpg
  • Reuse Automation Code
    http://www.cryan.com/qa/graphics/2021/ReuseAutomationCode.jpg
  • Modern Testing Chalk 2021
    http://www.cryan.com/qa/graphics/2021/ModernTestingChalk2021.jpg

February 14, 2021

February QA Images

Here are some QA graphics that I created for fun. I did some searching online and was surprised that there are good options to choose from.

Feel free to use these in any email or Wiki page.

QA Rule1
QA Rule 1

Test Better
Test Better

Think Q A
Think QA

The Artof Q A
The Art of QA

Q A Reference Page2021 Email
QA Reference Page

Check out all the QA Images that I have in my QA Library.

December 28, 2020

The Best 2020 QA Posts

It's time to look back at some of the best posts of 2020. (Check out the 2019 Post.)

Top 4 QA Blog Posts 2020

  • Slack Tips for QA - (February 11) Some useful tips and tricks to get the most out of Slack.
  • Dynamic Bookmarklets - (June 24) Great way to build a Bookmark that is time based.
  • Letter to the QA Manager - (September 9) An interesting letter that I found. Helpful in understanding the relationship of a QA Engineer and a QA Manager.
  • Best QA Advice - (October 14 ) Some great QA advice that I have gotten over the years.

2021 Goals

I'll keep posting useful QA Tips and Tricks that I learn. Most of the information is around the Software as a Service model.

I don't have any specific content targets. I'll keep the content going through February. I may take a break for some time to think of useful content to post.

December 21, 2020

QA Graphic Collection

Here’s some new graphics for the QA library. All month long, I have been collecting and using some QA memes.

Be sure to check out the entire QA graphic collection.

Developers Test Their Code

Quickstart Bug

Snoopy Welcome To Release Day

Release On Fire

December 14, 2020

Yagni

Yagni is an abbreviation for “You Aren’t Gonna Need It” - the principle that you shouldn't write code that isn't needed to pass tests or meet requirements.

YAGNI

Four Reasons QA Should Be Aware of Yagni

Makes Code Hard to Read - Developers may leave unused code behind - perhaps from their testing or design process. When someone else picks up the code, it could cause confusion about what the functionality is doing. This is true for QA as well, since sometimes they need to read the code review to understand how to run exploratory testing around the functionality.

Remove QA-Only Code - Developers may add code to help QA validate certain functionality. This should be removed from the code, not commented out. This way, the code doesn't accidentally get activated.

Adds Risk to the Code - If it's not part of the feature, then there's no reason it should go out in production. Keeping Yagni code in could open a back door to your application.

Applies to the QA Test Plan - Check your test case repository. Are there tests that fit the Yagni principle? Why are they part of your test plan? Clear them out so that your tests don't look so overwhelming.
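To make the QA-only-code point concrete, here's a hypothetical Python sketch; the function and coupon names are invented for illustration:

```python
def apply_discount(cart_total: float, coupon: str) -> float:
    """Apply a coupon code to the cart total."""
    # YAGNI violation: a QA-only backdoor left in the source.
    # Commenting it out isn't good enough - one bad merge or an
    # accidental uncomment and it ships. Delete it before release.
    # if coupon == "QA-FREE":
    #     return 0.0
    discounts = {"SAVE10": 0.10, "SAVE25": 0.25}
    return round(cart_total * (1 - discounts.get(coupon, 0.0)), 2)

print(apply_discount(80.00, "SAVE10"))  # 72.0
```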

Interesting Side-Note

According to Google Translate, Yagni is Turkish for "That is."

December 7, 2020

Throw it over the wall

Some developers use QA as their testing environment without testing their code first. In some situations, if the code compiled, they thought it was safe for QA to test.

In software development, this is known as “throw it over the wall” testing. Developers feel the change is so small that they just hand it to QA to test.

Throw It Over The Wall

Not a Good idea

Bad news - that’s not what QA is for. At least not in an agile environment.

Developers shouldn't be sending code to QA without doing some sanity testing first. They should at least know that the changes they are making work.

Yes, they should be testing their code.

What Should QA Be Used For?

QA should be used to test for security risks, performance issues, vulnerabilities, and usability. It shouldn’t be the front line of any testing.

Developers should always be testing their code. They should have accountability for how their code works.