Planning ahead in software development – just enough
Big Design Up Front (BDUF) is dead. There may not be much agreement on anything in software development, but the industry has largely concluded that BDUF doesn’t work.
As Agile practices and mindset took hold, up-front architecture was increasingly frowned upon. And for good reason: in many cases, a system architecture can evolve and be adjusted over time as the system matures. We call this emergent design. (At the opposite end of the up-front-planning continuum from BDUF lies “just start coding”, which is still practiced quite a bit too, with varying degrees of success; as the other extreme, it may suffice only in simple cases.)
Another mantra circulating in Agile circles is to start with the simplest thing that could possibly work. This approach lets us focus on solving the immediate problem at hand while ensuring we are not pursuing an overly complex solution. The theory is that – with the help of emergent design – the implementation can be adjusted down the line.
However, as organizations scaled their development teams and systems grew more complex, problems started to surface. It became problematic for several teams working on the same large solution to let design simply “emerge” in real time, at the team level. After all, an application architecture is not fully malleable and can be resistant to change. And several teams working on the same application require coordination, agreement on architectural approach, and at least some level of planning. A more intentional architecture was needed, without reverting to the BDUF days. And therein lies the key, which is more art than science: finding the sweet spot between emergent design and intentional architecture (see Agile architecture).
I have written before about ease of use and the key characteristics of applications that are perceived as user-friendly. In this post, I’d like to dive deeper into what I call the “mental model” that underlies applications. I would define the mental model as the intuitive perception a user has or develops about how an application is organized and how it performs its tasks.
The mental model has two key components:
The information architecture: How do the various elements used within an application relate to each other?
The process flow: Which steps does the user of an application have to perform to accomplish a task, and what states do the elements of the application move between? What business rules apply?
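The process-flow half of a mental model can be thought of as a small state machine: the states an element can be in, plus the business rules governing which transitions are allowed. Here is a minimal sketch under purely illustrative assumptions – the expense-report domain, its state names, and its rules are all made up for the example, not taken from any real product:

```python
# Hypothetical sketch: the "process flow" part of a mental model as a
# tiny state machine for an imaginary expense-report feature.
ALLOWED_TRANSITIONS = {
    "draft": {"submitted"},
    "submitted": {"approved", "rejected"},
    "rejected": {"draft"},   # business rule: rejected reports can be revised
    "approved": set(),       # terminal state
}

class ExpenseReport:
    def __init__(self):
        self.state = "draft"

    def move_to(self, new_state: str) -> None:
        # Enforce the business rules: only allowed transitions succeed.
        if new_state not in ALLOWED_TRANSITIONS[self.state]:
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state

report = ExpenseReport()
report.move_to("submitted")
report.move_to("approved")
print(report.state)  # approved
```

When the states and transitions a user imagines match the ones the application actually enforces, the mental model holds; when they diverge, the application feels unpredictable.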
The reason I wrote earlier that the user may already have a mental model in mind when starting to use a new application is that none of us is a blank slate. Based on prior experiences, we already have several very specific mental models we’re using or trying to apply when confronted with an unknown application. Let’s look at a few examples of existing mental models we already have:
As I’ve been using different software products and applications lately, here’s a question I’ve been pondering: what makes an application intuitive and easy to use? (Beyond that, one could ask what makes an app “fun” to use, attractive, addictive, etc., but those are separate topics of their own.) Assuming an application meets user needs via its built-in functionality, what makes the difference between being perceived as user-friendly and being perceived as hard to use?
I’m not a trained UX or Product Management professional, but based on what I’ve been experiencing, I’d boil it down to two qualities: predictability and discoverability.
Here’s what I mean by those terms:
An application is predictable when the user can…
Understand how its features work – without relying on documentation or “Help”,
Intuitively predict how the application will react to user operations – without specialized background knowledge, and
Reach correct conclusions about how to go about meeting his needs through the provided functionality.
If an application is predictable, it will be perceived as intuitive. More often than not, the way programs are designed makes their functionality hard to predict, and one’s gut reaction may be to just provide help, documentation, tours, etc. While these things are never bad, they merely work around the core issue, which is an inherent lack of predictability. Given the choice, it would be preferable to redesign the application so that it meets user needs predictably without requiring those aids.
One of my theories is that many applications get designed by developers, not UX experts. (No offense to developers – I’m one myself and do design my own apps!) Developers think about functionality differently, often approaching it from a technical perspective; they inherently know too much about the application and its design already, and can’t put themselves in the shoes of a casual or new user who lacks that background knowledge.
Other considerations to enhance predictability are:
Consistent use of UI elements, fonts, color schemes, iconography, screen layout, and naming conventions
High discoverability means that the features and functionality of an application can be easily discovered by the user – again, without relying on documentation or help. The user will not be able to leverage and appreciate an application’s functionality if it is difficult to discover that it’s actually there. If functionality is hard to find or practically hidden, there’s a risk of the user never finding it and getting frustrated because they can’t meet their need.
If an application is well instrumented, i.e. user interactions (e.g. time on page, mouse interactions, scrolling, entry and exit points, etc.) are tracked, logged, and available to the application provider, interesting insights can be gained about which parts of the application, features, and controls are used and how often. This information is vital as it depicts how users actually use the application, which may be quite different from how the designers intended it. It may also uncover screens or features that are barely ever used, perhaps because of a lack of discoverability.
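A minimal sketch of the kind of insight such instrumentation can yield – assuming a hypothetical in-memory event log of (user, feature) pairs with made-up feature names; a real product would feed these events into an analytics service instead:

```python
from collections import Counter

# Hypothetical event log, as an analytics backend might record it.
# Users and feature names are invented for illustration only.
events = [
    ("u1", "change_style"), ("u2", "change_style"), ("u3", "change_style"),
    ("u1", "insert_table"), ("u2", "insert_table"),
    ("u1", "insert_footnote"),
]

# Aggregate usage per feature.
usage = Counter(feature for _, feature in events)

# Features with very low usage may be unpopular -- or just hard to discover.
rarely_used = [feature for feature, count in usage.items() if count == 1]

print(usage.most_common())  # most- to least-used features
print(rarely_used)          # candidates for a discoverability review
```

Even a simple aggregation like this separates “nobody wants this feature” from “nobody can find this feature” – the starting point for the discoverability review the data suggests.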
The dark side of discoverability, which should obviously be avoided, is overloading the UI with menus, buttons, and controls, which could overwhelm the user. The art is exposing functionality in a thoughtful manner. One technique that could help is to create layers of functionality: primary functions, which are easily visible and accessible directly from the UI, and secondary functions, which are still easily accessible, but only via other UI elements (instead of being exposed directly). Example: Microsoft Word allows me to quickly change a paragraph style via a single click of a primary UI element, while inserting a footnote requires a trip to the menu. Once again, a UX professional might provide some much-needed skills in this area (and a counterweight to us developer types).
Overall, I think more attention needs to be paid to how we design applications. I guess in the end, I’m making a case for either engaging people with the appropriate education and experience or at least being more deliberate about how we design our apps and look at them from the perspective of an outsider, not from the angle of a technical insider. More advanced practices like instrumentation & application analytics, user research, user observations, interviews etc. will certainly also be invaluable in enhancing our applications’ predictability and discoverability.