NISC Usability Team is ‘Building a Better Mousetrap’

Every team at NISC is dedicated to producing the best possible software we can to meet the needs of the NISC Membership. Sometimes, though, it’s beneficial to take a step back and remember exactly which groups our software is designed to support.

“We know (the software) inside and out and it makes perfect sense in our mind,” said Laura Matthews, NISC Application Design Analyst. “However, we have to keep in mind that we’re not the user. We can’t un-know what we know about how the software works.”

What Matthews described is essentially the basic tenet of usability: designing a product with the user in mind to meet their needs and demands.

Since the birth of NISC, the iVUE® platform has been designed to meet the needs of the NISC Membership, with input from Members through various channels like Joint Application Development (JAD) sessions and NISC Member Advisory Committees. Usability, though, takes that one step further.

“Usability, in a nutshell, is user advocacy,” Matthews said. “It encompasses research, design, efficiency and effectiveness of the products. It’s taking and trying to understand the users themselves. It goes over and above what we typically use in our software processes, which are business requirements.”

Understanding the business challenges of NISC’s Membership is essential to creating and building the best software possible. Add to that the ability to understand how the user interacts with the software, and it becomes even better, helping NISC Members reach business objectives through non-business means.

“In a way, it’s a form of product and market research, so we’re better streamlining (our solutions) for our Member/Owners,” said Tom Pallesen, Senior Usability Analyst. “The more that (the solutions are) in tune with their objectives, the more efficient they are, the more profitable they are.”

The bulk of a usability study rests with comprehensive testing. This isn’t just sitting a user down in front of a screen and having them interact with a piece of software. Users are given a specific set of tasks, asked to complete them, and then graded on how well they do. The grade doesn’t reflect poorly on the user; rather, it helps the team understand where challenges exist within the software.

“Testing is a huge part of usability, so we plan to have a lot of that down the road,” said Ashley Moore, Usability Analyst. “Testing internally, we had our thoughts where people would have trouble, and some of those were confirmed, and we found a lot of new things that are potential problems down the road. So, that’s been really good for us so far.”

Testing, collecting data, analyzing that data and producing the findings make up the bulk of the work of a usability study. That data, however, then has to be put into action.

“We’re gathering all the information about what the user needs, but in time we have to make the better mousetrap,” Pallesen said. “The recruiting, testing, analysis, that’s the first two-thirds of the process. The back side is, now that we’ve got this (data), how do we address the issues we’ve discovered? Otherwise, why do the other two-thirds?”