Traditional requirements gathering tends to focus on data inputs, business rules and data outputs. Obviously, in order to have a system that functions correctly and reliably, these needs must be met. However, in traditional software development, the user interface is typically treated as something we can change later, as we learn about the application and how it is used. What if we reversed the direction of the design and started with the user interface?
Now, some people may argue that “use cases” have been created for many projects throughout the years. However, use cases have tended to be more of a process description than a description of how users expect to do their work. In the agile world, user stories come a little closer to what I am talking about, but they do not address what the user interface should look like. They do provide a very concise description of what the system should do, and in many cases are more appropriate than traditional use cases.
You may be wondering what brought this idea about. As a software engineer, I wrestle with requirements, implementing those requirements, and changing implementations based on how the users really want to work. So, the whole software development lifecycle is almost always on my mind. Recently, a few blog posts helped formulate some ideas that have been in my head for a while. First, Seth Godin asked what’s the use case? He started that post complaining about the process of working with an architect and proposed an idea that should work:
The most effective way to sell the execution of an idea is to describe the use case first. After the use case is agreed on, then feel free to share your sketches, brainstorms and mockups. At that point, the only question is, “does this execution support the use case we agreed on?”
The same idea applies to software engineering in the form of prototypes. When you want to show users how a new application will work, sometimes you need to make the user interface function without a working server behind it. Prototyping was very popular in the late 1990s, with tools like Visual Basic and PowerBuilder enabling quick creation of user interfaces. As those tools fell out of favor, prototyping declined in popularity as well. In the past few years, advanced user interfaces have become easier to create again, but prototypes have not seen a comparable resurgence.
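Making the interface work without the server usually comes down to putting a thin interface between the UI and the backend, then substituting canned data. Here is a minimal sketch in Python; the `OrderService` interface, `FakeOrderService` stub, and `render_orders` function are hypothetical names for illustration, not any particular framework's API:

```python
from abc import ABC, abstractmethod

class OrderService(ABC):
    """The contract the UI depends on; the real server implements it later."""
    @abstractmethod
    def recent_orders(self, customer_id: str) -> list[dict]:
        ...

class FakeOrderService(OrderService):
    """Canned data so the UI can be demonstrated before the server exists."""
    def recent_orders(self, customer_id: str) -> list[dict]:
        return [
            {"id": "1001", "status": "shipped"},
            {"id": "1002", "status": "processing"},
        ]

def render_orders(service: OrderService, customer_id: str) -> str:
    """Stand-in for the UI layer: formats whatever the service returns."""
    rows = service.recent_orders(customer_id)
    return "\n".join(f"{r['id']}: {r['status']}" for r in rows)

print(render_orders(FakeOrderService(), "cust-42"))
```

Because the UI only knows about the interface, swapping `FakeOrderService` for a real implementation later requires no UI changes, which is what makes the prototype worth keeping rather than throwing away.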
Part of this could be due to the rise of agile development which helps developers make changes throughout the life of a project. Many agile methods promote full implementation of a feature through user stories. Part of this implementation is the user interface for the feature. This means that instead of needing a prototype, features are added to the user interface as they are implemented. If the user interface does not meet the user’s standards or expectations, developers can quickly make changes to the interface. The main problem with this concept is that the user interface is still only seen as a way to exercise the feature being developed.
There are agile methods that focus more on the user interface, like Behavior Driven Development. I am not entirely familiar with BDD, but it does sound similar to what I am looking for. As an example, the concept of BDD is fairly well described by the following snippet taken from Wikipedia:
BDD is driven by business value; that is, the benefit to the business which accrues once the application is in production. The only way in which this benefit can be realized is through the user interface(s) to the application, usually (but not always) a GUI.
In the same way, each piece of code, starting with the UI, can be considered a stakeholder of the other modules of code which it uses. Each element of code provides some aspect of behavior which, in collaboration with the other elements, provides the application behavior.
The first piece of production code that BDD developers implement is the UI. Developers can then benefit from quick feedback as to whether the UI looks and behaves appropriately.
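The outside-in flow the quote describes is often expressed as given/when/then scenarios that drive out the code beneath the UI. Here is a minimal sketch in plain Python; the `Account` class and its display messages are hypothetical examples, not taken from any specific BDD tool:

```python
class Account:
    """Driven out by the behavior test below; the UI is its stakeholder."""
    def __init__(self, balance: int):
        self.balance = balance

    def withdraw(self, amount: int) -> str:
        # The behavior the UI needs: a message it can display directly.
        if amount > self.balance:
            return "Insufficient funds"
        self.balance -= amount
        return f"Dispensed {amount}; balance {self.balance}"

def test_withdraw_with_sufficient_funds():
    # Given an account with a balance of 100
    account = Account(balance=100)
    # When the user withdraws 40
    message = account.withdraw(40)
    # Then the UI shows the dispensed amount and new balance
    assert message == "Dispensed 40; balance 60"
```

The test reads as a user-visible behavior first, and the `Account` code exists only to satisfy it, which is the sense in which the UI becomes a stakeholder of the modules beneath it.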
By considering the UI as a stakeholder of the modules it interacts with, you make the UI a central figure in any discussion about the application architecture. However, from what I have read, the concept of refactoring is not applied to the user interface. The user interface, like any other part of an application, needs to be reviewed in a holistic manner. When people talk about the design or architecture of a system, they normally look at a big picture of a class diagram or an interaction diagram or some other pretty picture. These types of discussions do not occur in the same manner for the user interface. Typically, people only talk about the user interface when it becomes difficult to use.
Why not focus on the user interface more, so that you can make the user’s job easier? As software engineers, we are always looking for ways to make our jobs easier, so why not do the same for our users? In many cases, the development tasks are split between server-side coders and web developers. Then we let the web developer use their best judgment in designing the entire user experience. Web developers know what the current standards are as well as where commonly used components work best. However, they are not experts in how the users need to do their job.
We need to be gathering user experience requirements and refining those requirements throughout the life of the project. We need to be refactoring user interface features just like we refactor server-side code. If users cannot properly use the system, is it really user error, or did the developers just make the system too hard to use? If we focus on the user experience, maybe we can be rid of yet another acronym: PEBKAC (Problem Exists Between Keyboard And Chair).