Developing software for people with disabilities

2012-06-28 work software-engineering disability natural-user-interfaces usability

This article was written a long time ago. It may contain some broad generalizations that are not accurate.

My current internship at LIFEtool focuses on finding ways to utilize the Kinect as an input method for people with motoric disabilities. So far, I’ve been experimenting mainly with body poses and touching stuff (like a virtual button) as ways to express an intention.

When I’m talking about people with motoric disabilities, this usually means that they use a wheelchair and can only move parts of their body. Their freedom of movement and their accuracy vary widely from person to person. While some can use both arms freely and accurately, others need a lot of strength to move their hand just a few centimeters.

I had never really had anything to do with people with disabilities before, so this was a journey into the great unknown. For the first milestone prototype, I tried to anticipate the needs of the software’s audience as well as possible, based on second-hand information. I built a pretty accurate body pose recognition system, tested it thoroughly on myself and felt prepared for the first field test.

Facing reality

On the day of the first field test, my high expectations were shattered. We took the prototype to a local tech workshop for people with disabilities, set up the testing environment, and asked some of the workers to play a game I had hooked up to the body pose recognition system.

As it turns out, wheelchairs interfere significantly with the skeleton tracking software, which degraded the accuracy of the joint coordinates. The unstable movements of the testers increased the error further, making it harder still to detect body poses from joint coordinates. The system did have a tolerance mechanism, calibrated individually for each user, but it couldn’t handle deviations on this scale without increasing the number of falsely detected poses.
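The article doesn’t show how the pose recognition worked, but the idea described above — comparing observed joint coordinates against a pose template, with a per-user tolerance — can be sketched roughly like this. All names here are hypothetical, not taken from the actual prototype:

```python
import math

def pose_matches(observed, template, tolerance):
    """Return True if every joint in `observed` lies within `tolerance`
    (in meters) of the corresponding joint in `template`.

    Both arguments map joint names (e.g. "hand_right") to (x, y, z)
    coordinates as delivered by the skeleton tracker."""
    for name, target in template.items():
        if math.dist(observed[name], target) > tolerance:
            return False
    return True

def calibrate_tolerance(samples, template, margin=1.5):
    """Derive a per-user tolerance from calibration samples recorded
    while the user held the pose: take the largest joint deviation seen
    and scale it by a safety margin.

    A larger margin accommodates unstable movement, but — as in the
    field test above — it also raises the false-positive rate."""
    worst = 0.0
    for observed in samples:
        for name, target in template.items():
            worst = max(worst, math.dist(observed[name], target))
    return worst * margin
```

The trade-off the field test exposed lives entirely in `margin`: widening the tolerance until wheelchair-induced tracking noise stops breaking detection also makes unrelated movements register as poses.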

Why people with disabilities are great testers

Now imagine a typical test session with your average user. To get their honest opinion about the product, you would probably have to resort to the information extraction methods of the Spanish Inquisition. Good testers are hard to find these days; everybody is far too nice and too worried about the developer’s feelings.

My testers are better.

My testers are honest. If something’s wrong, they tell you. If they like your product, they start negotiating the price. If they don’t, they say “This piece of crap doesn’t work at all. I will never use it again.”, and drive away. It doesn’t matter that the guy who spent most of his working time over the last month building this thing is standing right there.

And instead of being pissed, I’m grateful. This ragingly disappointed user kept me on track and motivated to improve the prototype so that he could use it too. In case you’re wondering, we convinced him to give it another try at the next test session, and he was thrilled by how well it worked compared to the first time around.

Moral of the story

  • Test with actual users.
  • Test early and often.
  • Encourage honest feedback.
  • People with disabilities are awesome testers!
  • It is not ethically responsible to outsource testing to the Aperture Science Computer-Aided Enrichment Center.