The Science of Digital Landscapes

Calm down, reader. We're not telling you to go out and buy goggles and an Erlenmeyer flask (but you can if that makes you feel fancy). The science of digital landscapes is grounded in one simple method: testing. As elusive as website testing may seem, it's a method with important foundational principles.

A/B Testing

Maybe you've known for years that your company should be testing its webpages, but you don't know where to start. Some testing tools marketed as easy to use quickly turn complicated, and some don't support the full range of testing you're looking for. In our experience as a digital marketing agency, testing platforms are ever-evolving, but over our years of testing we've established some staple principles and approaches that anyone looking to improve their digital presence should follow.

1. Annotate Everything

Even if you don't yet know what to do with your Google Analytics data, annotations are crucial to both effective testing and keeping a history of your website. If you ever think, "Should I annotate this?" the answer is usually "Yes." We often dig into old data looking for trends, and we'd never be able to identify what kicked traffic up or down without a stream of relevant annotations.
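If it also helps to keep a machine-readable copy of that history alongside the annotations you make in the GA interface, a minimal sketch of a local change log might look like the following. The file name and fields here are purely our illustration, not a Google Analytics format.

```python
import csv
import os
from datetime import date

# Hypothetical local change log kept alongside GA annotations.
# File name and field names are illustrative, not a Google Analytics format.
LOG_FILE = "site_change_log.csv"
FIELDS = ["date", "change", "expected_effect"]

def annotate(change, expected_effect, when=None):
    """Append one dated entry describing what changed and what you expect."""
    when = when or date.today().isoformat()
    new_file = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()  # first entry: write the header row
        writer.writerow({"date": when, "change": change,
                         "expected_effect": expected_effect})

annotate("Changed homepage CTA button from green to orange",
         "Higher click-through on the CTA")
```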

2. Measure Results

With annotations in place, what remains is interpreting the data between and around those annotations. Figuring out what's really going on with your site and attributing it to a cause keeps the testing cycle turning. Maybe you changed the color of a button, and suddenly traffic to the page that button links to drops off. That's just enough information to go back to square one and try a different approach.
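As a rough sketch of that kind of check, here's a comparison of average daily pageviews on either side of an annotated change. The dates and numbers below are placeholders, not real analytics data.

```python
from datetime import date

# Placeholder data: daily pageviews for the page the button links to.
daily_pageviews = {
    date(2013, 8, 10): 340, date(2013, 8, 11): 355, date(2013, 8, 12): 348,
    date(2013, 8, 14): 210, date(2013, 8, 15): 198, date(2013, 8, 16): 205,
}
annotation_date = date(2013, 8, 13)  # annotation: "changed button color"

before = [v for d, v in daily_pageviews.items() if d < annotation_date]
after = [v for d, v in daily_pageviews.items() if d > annotation_date]

avg_before = sum(before) / len(before)
avg_after = sum(after) / len(after)
print(f"Avg daily pageviews: {avg_before:.0f} before vs {avg_after:.0f} after "
      f"({(avg_after - avg_before) / avg_before:+.0%})")
```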

3. Form Realistic Hypotheses

Sometimes it's hard to know what to test. As much as possible, let the data dictate what you test. Look for data drop-offs and low-engagement page elements. Rework huge eyesores, like walls of text and jarring readability roadblocks. Once you're in the regular practice of measuring results, you'll find yourself quickly collecting a pool of what to test and how to test it.
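One way to let the data dictate your tests is to pull a page-level export and rank pages by bounce rate. The file name and column headers in this sketch are assumptions about what your export looks like, not a fixed analytics schema.

```python
import csv

# Hypothetical: rank pages by bounce rate from an analytics CSV export.
# Column names ("Page", "Sessions", "Bounce Rate") are assumed, not standard.
def low_engagement_pages(path, min_sessions=500, top_n=10):
    candidates = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sessions = int(row["Sessions"].replace(",", ""))
            bounce = float(row["Bounce Rate"].rstrip("%"))
            if sessions >= min_sessions:  # ignore thinly trafficked pages
                candidates.append((bounce, sessions, row["Page"]))
    return sorted(candidates, reverse=True)[:top_n]

for bounce, sessions, page in low_engagement_pages("pages_export.csv"):
    print(f"{page}: {bounce:.1f}% bounce over {sessions} sessions")
```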

As a general rule, avoid forming hypotheses with "best practices." Your site users are your own, and they don't belong to A-list marketer/blogger Joe Schmo who wrote that post about always including purple unicorns in the footer. Nothing can inform what's best for your site like your data does.

4. Use Only Significant Data

"How long should we test this?" I dub this the question of the year, every year. My response: "Until the data is statistically significant enough to draw a conclusion."

Not everyone is a statistician. That's okay. Just make sure you've got one on hand to check your conclusions. Interpreting data isn't always easy, so it's best to leave any complicated analysis to someone who can measure statistical significance.
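For a sense of what that check involves, here's a minimal sketch of a two-proportion z-test for a simple A/B conversion comparison, with placeholder numbers. Many testing platforms report this for you, so treat this as an illustration rather than a replacement for your tool's reporting.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
    return z, p_value

# Placeholder numbers: 58/1000 conversions for A, 73/1000 for B.
z, p = two_proportion_z_test(58, 1000, 73, 1000)
print(f"z = {z:.2f}, p = {p:.3f}  ->  "
      f"{'significant at 95%' if p < 0.05 else 'keep testing'}")
```

With these particular placeholder numbers the result is not yet significant, which is exactly the "keep testing" answer to the question of the year.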

5. Accept When You're Wrong

Hypotheses are going to be wrong. Many tests will show you that you didn't have it all figured out after all. Don't let this make you feel like a failure, or like you don't know your market well enough. There are too many factors outside of your control for you to always have a handle on how things should pan out. Being wrong in your assumption will ultimately land you on what does work. Let testing teach you about your audience. Relationships are hard!

We at Delegator know as well, if not better, than anyone else how difficult the entire testing process can be. We try our best to stick closely to these core principles and attitudes that we know will carry us through the frustrations, and, consequently, we land on many victories. Go ahead and make some adjustments. Approaching website improvements scientifically should alleviate some fears and uncertainty, and it's best to remember that your website can ALWAYS be improved, no matter how great you may think it is. Just ask the data.