Tuesday, February 23, 2016

Just because no one complains doesn't mean all parachutes are perfect. (Benny Hill)

Does your organization make use of voice of the customer (VOC) systems? These typically combine traditional channels for customer feedback (surveys, "give feedback" links, and so on) and display them all in one system. They usually try to automate everything as well, using techniques like text analysis and machine learning. They also tend to feature lots of cool graphs, a snazzy dashboard, and all sorts of bells and whistles.

Personally, I think they’re wonderful. Hey, it’s all feedback, right? 

At the same time, though, I have run into a number of people over the years who seem to rely on this kind of feedback almost exclusively. Now, the various methods that make up a VOC system do have a number of strengths (sheer numbers are always at the top of the list), but they have a number of drawbacks as well. 

So, what are some of those problems? I see three issues on the user’s end:

Knowing – First of all, the user needs to know that there’s a problem. In the lab, I often see users who think they have completed a task, but who actually have some steps remaining. I also sometimes see users complete another task by mistake, but be totally unaware that everything isn’t just peachy-keen. 

Another issue is work-arounds, a special problem for experienced users. They may be so used to doing things a certain way, they may not even be aware that their experience might have some issues, let alone complain about it.

A special issue for surveys is human memory. There is often a major time lapse between when a user has an experience and when they get surveyed. The chance of them remembering specific details can often be very low.

Articulating – Second, the user has to articulate the problem. Note this is not as trivial as it may seem. Believe me, I’ve been doing this for 30 years, and I still struggle to figure out exactly what went wrong in some instances during a test. Is this an example of a poor cognitive model, Fitts’s Law, progressive disclosure, skeuomorphism? Now, imagine some grandma from Iowa trying to do something similar.

What you often get instead are very non-specific comments. Just as an example, it truly is amazing how many times over the years I’ve seen my particular favorite, “This sucks!” Not a lot of actionable information in that one, huh? (Just as an aside, one major strength of usability testing is that it allows follow-up on these vague responses.) 

Caring – Finally, the user has to care enough to share what they think with you. And that’s where those traditionally low response rates come from. In fact, would you believe that some companies are actually happy if they get a rate of 0.5%? Wow, how representative can that be?

So, who does fill out these things then? Well, there is typically a fair amount of self-selection going on. You might get haters, or fan-boys, or the terminally cranky and hard to satisfy, or the eager to please. 

And that too is another benefit of testing. Though a test almost always involves small numbers, you do know what every one of those users thinks or would do – even if they would never respond to a survey or click on “give feedback.”

A final issue with caring is what I would call a threshold issue. In other words, with a VOC system, you’re probably going to get a lot of highs and lows. If, however, something was not a huge disaster – or, alternatively, a life-changing experience – it’s probably not worth reporting. 

In fact, you might well run into the death-by-a-1000-cuts syndrome. Just imagine several million users who have a couple of lower-level issues every time they log in to your system, but never enough to actually complain about. Now, imagine another similar system that comes along and doesn’t have any of those low-level issues. Imagine, further, that all those users leave you for that system overnight. What would you then have in hand that would give you any idea why that happened (or – even better – that something like that might be going to happen in the near future)?

On the opposite end of the spectrum, you can get something akin to Benny Hill's parachutes. In fact, one of my favorite clips of all time came from a test I was doing on a production system. At the end of a particularly trying task, a survey popped up. If I remember correctly, the user said something along the lines of, "If they think I'm going to fill out their %#$@ survey after that @*#$% experience, they've got another &@^#$ thing coming."

In sum, VOC systems are wonderful, but they can involve their fair share of missed signals and false alarms. To make sure they are more than a glorified suggestion box, it can be helpful to triangulate their findings with other sources of data – web analytics, call center data, and ... even usability testing. 


Benny Hill, famous comedian, ladies' man, and usability guru

Tuesday, February 2, 2016

Instructions must die. (Steve Krug)

I started out as a tech writer. I used to write manuals that would tell users how to use computer systems – hundreds and hundreds of pages of instructions.

In fact, that’s how I got into usability. The initial scenario would involve me going back to my developers and telling them that some particular feature took four or five pages to explain. 

“Could we make that a little simpler? What if we moved that bit to the next page, and got rid of this thing here? It might make more sense that way too, right?”

Over the years, developers started bringing me things to look at before I wrote them up. From there, it was a small step to asking for my input upfront, to letting me design a few things on my own – to even doing a little usability testing.

Now, that was a long, long time ago (we’re talking the ‘80s here, folks). It’s kind of strange how that instruction thing is still around though. 

Now, it’s been a long time since I saw – let alone worked on – a manual. What I’m talking about here, though, is something I often see in design or review meetings – basically, kind of a knee-jerk reaction to issues with fields or forms or pages to “just throw some instructions in there.” 

Now, those instructions can appear at the top of the page, to the right of a field, under the field, in the field, in help, in FAQs, wherever … The real problem, however, is that nobody ever reads them.

And even if they do, they’re really just one more thing to add to the user’s cognitive load. Why can’t the action just be obvious on its own?  Why do we even need instructions? 

In Don’t Make Me Think, Steve Krug concentrates on boiling the instructions down. There are plenty of instances, though, where doing a little thinking can eliminate instructions altogether. 

My favorite example is probably the date or telephone number or Social Security number fields that won’t accept delimiters (you know, the / or -). Just strip ‘em out on the back end, and you can kiss those instructions goodbye.
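To illustrate, here is a minimal sketch of that back-end clean-up in Python. The function name and the ten-digit US phone format are my own assumptions for the example; the point is simply that normalizing the input server-side removes any need to instruct users about delimiters.

```python
import re

def normalize_phone(raw: str) -> str:
    """Strip every non-digit character (hyphens, dots, parentheses,
    spaces), so users can type the number however they like.
    Assumes a 10-digit US number; adjust the check for other formats."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) != 10:
        raise ValueError(f"expected 10 digits, got {len(digits)}: {raw!r}")
    return digits

# All of these variants are accepted with zero on-screen instructions:
print(normalize_phone("555-867-5309"))    # 5558675309
print(normalize_phone("(555) 867 5309"))  # 5558675309
print(normalize_phone("555.867.5309"))    # 5558675309
```

The same idea applies to dates and Social Security numbers: accept whatever delimiters people naturally type, and canonicalize once on the server.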