How To Use Qualaroo and Optimizely Together
January 25, 2013
We are big fans of Optimizely for running A/B tests on web pages. Like Qualaroo, it is one of the tools that empower marketers to be less dependent on engineering to improve website conversion rates. In fact, the two tools become even more powerful when you use them together. Many of our customers use Qualaroo to identify conversion issues and then apply those insights to their next Optimizely test.
For example, a Qualaroo question can appear at the point where someone is about to download software asking: “Is there anything preventing you from downloading the software at this point?” The answers to this question can help you create a much more effective test version of the download page that addresses real customer issues. One customer recently shared that this approach helped them create a page variation that doubled the download rate in a single test.
Since so many of our customers use both Qualaroo and Optimizely, we’ve now made it easier to get even more value from the combined toolset.
Target Surveys to a Specific Optimizely Variation
You can now target a survey to visitors who are assigned to a particular variation of an Optimizely experiment.
How to Configure This
- Find the experiment ID and the variation names in your Optimizely dashboard.
- Enter the Optimizely experiment ID and the variation name in the survey configuration.
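Conceptually, the targeting check pairs the visitor's assigned variation with the experiment ID and variation name you entered. A minimal sketch of that matching logic (the assignment map and the `SurveyTarget` shape here are illustrative assumptions, not Qualaroo's or Optimizely's actual API):

```typescript
// Hypothetical shape: map of Optimizely experiment ID -> assigned variation name
// for the current visitor.
type VariationAssignments = Record<string, string>;

// Illustrative survey-targeting config: show the survey only to visitors
// bucketed into this variation of this experiment.
interface SurveyTarget {
  experimentId: string;
  variationName: string;
}

// Returns true when the visitor's assignment matches the survey's target.
function shouldShowSurvey(
  assignments: VariationAssignments,
  target: SurveyTarget
): boolean {
  return assignments[target.experimentId] === target.variationName;
}

// Example: a visitor assigned to "Variation #1" of a hypothetical experiment "1234567".
const assignments: VariationAssignments = { "1234567": "Variation #1" };
const survey: SurveyTarget = { experimentId: "1234567", variationName: "Variation #1" };
console.log(shouldShowSurvey(assignments, survey)); // true
```

Visitors in a different variation (or not in the experiment at all) simply never see the survey.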
What Can You Do With It?
This integration allows you to do two things:
1. Use Qualaroo surveys to understand why one Optimizely variation performs better
Suppose you run an A/B test on a page that explains the pricing of your product, and a certain variation ends up converting better. Knowing why makes it easier to evolve the winning variation further. This is especially true when the difference in performance is not drastic. Ask your visitors who saw each variation – “Is our pricing clear? If not, what did you find confusing?” Targeting a Qualaroo survey to users who saw a particular version of the page helps inform your next experiment.
How to do it?
Create a survey for each variation and target it to that variation. Each survey will then be displayed only to visitors who are in that variation of the experiment, and only while the experiment is active. Our delay option can give your visitors enough time to read the page before the question appears.
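As a sketch, the per-variation setup described above might look like the following (field names such as `delaySeconds`, and the example experiment ID, are illustrative assumptions, not Qualaroo's actual configuration schema):

```typescript
// Illustrative only (not Qualaroo's real schema): one survey per Optimizely
// variation of the pricing-page experiment, each delayed so visitors can read first.
interface SurveyConfig {
  question: string;
  experimentId: string;   // Optimizely experiment ID from the dashboard (hypothetical value)
  variationName: string;  // the variation this survey is targeted to
  delaySeconds: number;   // wait before showing the question
}

const surveys: SurveyConfig[] = [
  {
    question: "Is our pricing clear? If not, what did you find confusing?",
    experimentId: "1234567",
    variationName: "Original",
    delaySeconds: 15,
  },
  {
    question: "Is our pricing clear? If not, what did you find confusing?",
    experimentId: "1234567",
    variationName: "Variation #1",
    delaySeconds: 15,
  },
];
```

The same question goes to both groups; only the targeting differs, so differences in the answers can be attributed to the page variation each group saw.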
2. Use Optimizely to A/B test Qualaroo surveys
Several customers have expressed interest in A/B testing surveys against each other to find the wording and question order that get the most high-quality responses. Now you can use Optimizely to run this experiment.
How to do it?
Create an experiment in Optimizely with two variations. The variations will not modify the page itself. Configure two surveys on the same page. Target each to a different variation. Done. Now Optimizely will assign some users to one survey and some to the other. Wait for enough data and analyze the number and quality of responses that each version of the survey received.
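Once the experiment has collected enough data, the comparison boils down to response counts and rates per survey variation. A small illustrative calculation (the variation names and numbers are made up):

```typescript
// Illustrative: compare the response rates of two survey wordings.
interface SurveyResults {
  variationName: string;
  impressions: number; // visitors who were shown the survey
  responses: number;   // visitors who answered it
}

// Fraction of visitors who answered; guards against division by zero.
function responseRate(r: SurveyResults): number {
  return r.impressions === 0 ? 0 : r.responses / r.impressions;
}

// Hypothetical results for two survey wordings.
const a: SurveyResults = { variationName: "Wording A", impressions: 2000, responses: 120 };
const b: SurveyResults = { variationName: "Wording B", impressions: 1980, responses: 178 };

const winner = responseRate(a) >= responseRate(b) ? a : b;
console.log(`${winner.variationName} has the higher response rate`);
```

Raw rate is only half the picture: as the post notes, you should also judge the quality of the open-ended answers each wording produces.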
This feature is available on our Small Business and Professional plans. Both plans come with a 30-day free trial: sign up and contact us at email@example.com, and we'll enable the feature for you. If you are already on one of these plans, just email support if you would like to give it a try.