At Xceptance we think that “you live and you learn,” so we were eager to read about an upcoming training week by SQE in Boston (March 24-28). The various tracks had promising titles like “How to Break Software: Robustness Testing Unleashed” and “Testing Under Pressure,” but we picked our favorites and chose “Exploring Usability Testing” and “Mobile Application Testing”.
The usability training was scheduled for just one day, the mobile training for two. We knew that it would be tricky to fit an overview of usability testing in just one day of training, and it turned out the instructor was well aware of these concerns.
First, however, came Mobile Application Testing. We were a class of about 12 people, most of whom had come in pairs from the same company. Our instructor introduced himself as a software security specialist, and the usual round of participant introductions followed. Since most of us already had solid general knowledge of software testing, we skipped the basics and moved straight into the main topic. We were asked to open a site on our mobile devices that we had so far only visited from a laptop or PC, compare the two versions, and share the results with the class. This brought up some interesting and surprising insights; clearly there are companies out there that need mobile QA!
One of our next tasks was to invent our own app, including its purpose and components. We reused those apps for other tasks throughout the day, so this was a key exercise. Our instructor gave us many more exercises that day and the next, along with further information about mobile testing platforms, strategies, techniques, and tools. Among the topics were general approaches to mobile testing and all the things you wouldn’t think of when testing desktop sites but that can make a difference for mobile applications, such as shaking the device, checking the impact of different networks, and varying screen resolutions across devices. We also looked at mobile testing platforms like SauceLabs and weighed the pros and cons of emulators and desktop browsers for mobile testing. All in all, these two days of training were a valuable experience and provided new and useful ideas on how to tackle mobile application testing.
The usability class was a similar size, and all the other students came in groups of two or three from very large, well-known companies. In fact, as we introduced ourselves, it turned out the instructor had worked with every one of them, so only Xceptance was a new name to him. As already pointed out, one day was really too short, but the instructor did a remarkably good job of pulling out the highlights, so that participants new to the topic could walk away with a better idea of what “usability” is, while also touching on more in-depth aspects and providing additional resources so we could continue learning after the day was over.
First of all, we talked about what “usability” even means; a helpful trio of “-bilities” comes from the ISO 9126 Quality Standard: learnability, understandability, and operability (since revised somewhat in the newer ISO 25010, but still useful). But that’s only half the battle: those are somewhat subjective impressions, and one person’s “understandable” could easily be another’s “incomprehensible”. Determining who a product’s users are, what they do with it, and in what context they do it is just as critical. With those user types in mind, scenario-based testing built around “typical, real usage scenarios” helps uncover the disconnects between what a product’s users need or expect it to do and what it actually does.
A discussion of usability heuristics, based on Jakob Nielsen’s work, followed; these provide a very handy scaffold to build tests around. Finally, the instructor gave us an overview of how to organize and run a usability test, including videos of actual tests he had conducted. The course material covered other major areas as well, such as mobile, accessibility, and safety testing as they relate to usability, but we didn’t have time to discuss these during the training itself. Yet short as the class was, getting new ideas about how to describe, categorize, and prioritize usability issues was a valuable experience, and it will help us advocate for usability bugs in future projects!