Way back in 1990, Jakob Nielsen collaborated with Rolf Molich to develop a set of 10 Heuristics for User Interface Design. They have withstood the test of time; I still refer to them when I'm reviewing interfaces. Of them all, I see this heuristic violated most frequently:
- Error prevention – Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
A short time ago, I came close to sending those ten heuristics to the good folks over at SurveyGizmo.
It was Friday afternoon. A deadline loomed large. I was testing a survey I had created, but I kept coming across problem after problem. I had already contacted technical support twice. To quote my colleague, it was a "cascade of disaster."
So I sent this nastygram to technical support:
Dear Technical Support–
Forgive me and fair warning to the poor recipient of these problems:
I am a pissed off, paying customer. I will try to contain myself.
I am about to launch a survey and have been conducting testing. I'm finding bug after bug in your product, and while I appreciate workarounds, I'm tired of them.
Your product has to work as advertised. Period. And if it doesn’t work as advertised, you need to prevent me from thinking it can do what it cannot.
This is my latest issue:
See question 4. User selects 1 or more items from the long list. The options in the list need to be randomized for each user.
See question 5. Selections from question 4 are piped in. User is asked to rank 5 items within the list.
Now, it's bad enough that the advertised numerical validation and min/max validation don't work on piped values, but how you REPORT the data from this question is something out of The Daily WTF. But I digress from the problems at hand…
Export the results to CSV. The items the users rank aren't tied to any values. I randomly see 1, 2, 3, 4 but have no idea what they correspond to. Now, to be fair, I'm not surprised by this problem. There's a bit of programming involved. And I thought, "Well, that stinks, because it means I'll have to export each individual answer. And yes, we're expecting 350 responses and that will make a TON of extra work, but if it's what I have to do…"
I discovered Issue 2 when I did that [exported an individual response].
Open any individual response. Let’s work with response ID: 123456
Look at how question 5 was reported:
(option: 2) 2
(option: 24) 3
(option: 25) 4
(option: 30) 1
(option: 9) 5
It’s baffling to me because the reporting values in question 4 are not set to display numbers.
Once again, I'm trying to cut SurveyGizmo some slack. I thought I could "decode" those option numbers by looking at the way they are listed in the question editor. Put another way:
1 = This is an option the user can select.
2 = This is another option the user can select.
3 = And here’s yet another option the user can select.
The answer is NO.
I gave this item a score of 2:
And here’s yet another option the user can select.
From the list above, you can see that it’s actually number 3.
I’m pretty sure the mapping would have worked if I didn’t have randomization in question 4 turned on, but I cannot turn off that option.
So here’s where all this has left me:
I cannot launch the survey because I can’t get ANY reliable data.
And that means I’m going back to a client to say: Yeah, sorry, we’re going to miss that launch deadline because the survey software we’re using doesn’t work as advertised.
Thus, I am a pissed off, paying customer.
I’d love to see some fixes in place by noon Monday (EST). What are the odds of that happening?
The odds ended up being shockingly good thanks to SurveyGizmo’s superior customer service.
Here’s what they did right:
- They responded to my email within an hour (thus meeting their service level agreement).
- They apologized! An apology can go a long way.
- They gave me a direct cell number I could call, but also offered to call me at a time that was convenient.
- They reviewed the other issues I had with the survey. They got a complete picture of my goals and the ‘solutions’ that had already been offered.
- They didn’t make excuses about how the feature I was using just “wasn’t meant to do that” (even though it was true).
- They focused on putting a solution in place for me within 30 minutes to 1 hour. I didn’t have to do anything.
- They offered to teach me about the solution they put in place so I could recreate it again in the future.
- Most importantly, they delivered on their promise.
Should the SurveyGizmo user interface design be improved to prevent dreadful experiences like the one I had? Absolutely. But dreadful experiences are bound to happen. Having a technical support team trained in customer service is critical if you want to retain customers.
Imagine what could have happened if they hadn’t had stellar technical and customer service.
I was prepared to tell my boss that we should not renew our enterprise subscription. My boss would have listened to me. After all, there are plenty of low-cost survey tools on the market. And we would have told our colleagues to avoid SurveyGizmo. They might have listened to us.
So three cheers to SurveyGizmo for taking on the pissed off, paying customer. I was absolutely astonished and deeply appreciative.