Read Write Web has an interesting article on the concept of the contextual user interface. A contextual user interface – as the name implies – is an interface that adapts to the current wishes of its users: the context. The interface changes according to the actions the user takes: it presents a minimal set of options at first, and reveals others as the user goes along. While the article makes some good points, it also contains some generalisations that I find rather debatable.

The article explains that until not too long ago, we were living in a world of Windows-dominated user interfaces, which presented a standard set of user interface elements (widgets). According to the author, every application was full of these widgets, “and nothing else”. He claims that the user interface world was not one to be innovative in.
User interface was not the place to be innovative. It was considered unorthodox and even dangerous to present the interface in non-standard ways because everyone believed that users were, to be frank, stupid, and wouldn’t want to deal with anything other than what they were used to.
He continues by saying that the recent wave of user interface innovation is proving that the users-are-stupid train of thought is losing speed. “Thanks to Apple, we have seen a liberating movement towards simplistic, contextual interfaces.”
The problem with this train of thought is that it blindly assumes that these contextual user interfaces are, by definition, superior. And I’m not so sure of that, because contextual user interfaces pose problems of their own. Whether you present all your options up front or hide them away, neither is an ideal solution.
Present your users with all options under the sun, and they’re overwhelmed. Present your users with only the most basic functionality, and hide the rest away in menus and dialogs – or don’t present it at all – and you’re bound to annoy certain users who see their pet features tucked away in some faraway place, or worse yet, axed altogether, all in the name of simplicity. GNOME, for instance, has had some serious problems with its users over the removal of features, but even Mac OS X itself, seen as some sort of UI guidebook from heaven by the author of the article, suffers from this problem. Discoverability is just as much a problem in Mac OS X as it is in Windows – being overwhelmed or underwhelmed can both lead to “where the heck is my feature?”
For a small application such as an audio player or notepad it’s easy to show only the most used features – but go to anything more complicated, and you are sure to hit problems at some point. Something like Word or PowerPoint has ten million billion different features, and millions and millions of users – how on earth are you going to determine which of those ten million billion features are the ones you want to show by default, and which are the ones you wish to hide? A common saying thrown around on the internet is that 90% of the people use only 10% of Word’s features – but the problem is that those 10% are different for each individual user. So, which features do you present as the few default options, and which do you hide, only to be revealed upon user request?
The other extreme isn’t much better either, of course, as the article explains. Just presenting every possible feature directly to the user will only confuse him or her. Microsoft tried to remedy this with the ribbon interface in Office 2007, but despite the fact that I like it, the resistance to it has been fairly vocal.
The issue here is that despite its obvious advantages, the contextual user interface is, sadly, not the holy grail of user interface design. For simple applications, yes, it will probably work – but write a contextual user interface for an application that’s a bit more feature-laden, and the contextual user interface itself can become highly frustrating.
To me, the contextual user interface is a very welcome addition – but it’s not the silver bullet for UI confusion. As always, moderation and balance are key.
Isn’t the ribbon in Office supposed to be a context-sensitive thing?
I don’t like that it isn’t easy to get back to other tools if the list of options keeps changing.
Conceptual, or contextual? You keep switching between the two words in what looks like some phonetic Freudian slip (because they make no sense if treated as different words).
In terms of interface complexity, there is a major divide between applications designed for consuming content and those designed for producing it.
In my opinion, content consumption applications like media players and just about everything on the web lend themselves to simplified and contextual interfaces.
On the other hand, content production applications, like Visual Studio, Word and Maya, will always have more complex interfaces because of the level of functionality they must provide.
I think there is certainly room to bring contextual interfaces to content production applications, but it takes a very keen sense of design to do it correctly.
Thom, I’m sure you’ve probably read the Why the New UI? series on the office blogs. It’s a fantastic read discussing the rationale for the new Office 2007 UI, and the difficulties in presenting features. It also solidly crushes the 90% myth with real data.
One of the best series of articles about UI design in the wild that I’ve read. http://blogs.msdn.com/jensenh/archive/tags/Why+the+New+UI_3F00_/def…
And may I add to this an hour-long video of how the ribbon was designed, with tons of wacky prototypes, backed up with user testing that exposed the shortcomings of the old interface. This is a real must-watch if you have any interest in the process of UI design.
http://blogs.msdn.com/jensenh/archive/2008/03/12/the-story-of-the-r…
What a load of palaver. When will naive Apple fanboys cease spouting dubious usability and computer-history notions as if they were facts?
From the article: “But there is another fairly recent innovation, which might have just as profound implications. We’re speaking of the contextual user interface… Thanks to Apple, we have seen a liberating movement towards simplistic, contextual interfaces”
Contextual interfaces intended for consumers are not recent, and they proliferated long before Apple Computer existed. For example, by the late 1960s, bank ATMs gave the customer a set of choices, and, after a choice was made, gave the customer a new context of choices. Hence, early ATMs featured a contextual interface: http://www.thocp.net/hardware/atm.htm ATMs were everywhere by the late 1970s, when Apple Computer started.
Contextual menus certainly must have appeared in the computer world before they appeared in bank ATMs. Such menus even existed early on in Microsoft products: in the DOS installation, and in early Windows “Wizards” with contextual, step-by-step interfaces.
In addition, menus with changing contexts appeared long ago in various other devices: vending machines (the first menu asks for a selection, then the next menu asks whether to “give change or make another selection”), early video games, jukeboxes, etc.
Furthermore, contextual UIs are not limited to machines with screens. Any early phone-answering system that offered primary and secondary sets of choices constitutes a contextual UI. The same goes for any early board game in which the player lands on a square and is then given options for the next move.
The article describes the earlier Windows interface: “Every imaginable choice was thrown at users at once and it was up to the poor user to figure out what to do.”
How does the writer come to this conclusion? The only thing that he gives to support this assertion is a partial screenshot of what looks like Microsoft Word with every toolbar displayed. How many times has anyone seen Word (or any other Windows app) deployed with all toolbars displayed? This can’t be the default set-up — some user must have turned on all those toolbars.
Even if it were the default set-up, one would have to be completely clueless not to eventually figure out how to turn off the toolbars. Also, the meanings of most of the buttons are self-evident, and most of the button icons are used across multiple OSs.
Windows applications have never really been more or less complicated than those in any other OS. They made extensive use of contextual dialog boxes. In addition, some applications were/are multi-platform.
And, again, the Windows wizards always gave simple contextual menus, one step at a time.
The article continues its Windows description: “To cram more information onto the screen, the interfaces of that era used tabs. At some point Microsoft invented the ultimate UI element – a tab with a scroll button in the end which allowed the user to page through hidden tabs.”
First of all, how does the author know that Microsoft invented the tab scroll button/bar? He doesn’t supply any proof nor any reference to support this claim.
Secondly, usability-wise, what is wrong with having a scroll button/bar for tab overflow?
Thirdly, the Mac way of handling tab overflow seems to be a 2D “double arrow” within the rightmost tab: http://i25.tinypic.com/hv59xz.jpg How is the Mac method better, usability-wise? Also, it seems that, in the case of Safari, the arrow system does not allow one to drag the overflow tabs – a hindrance.
The article continues in the next paragraph: “Another philosophy of the old UI approach was that the user wants to see all information all the time.”
The author is repeating the point he made in his previous paragraph. So, again, how does he come to this conclusion?
“… there was a myth spread that users were stupid and would not be able to understand a non-standard UI.”
What? Where did that myth originate? Has anyone else heard this myth? Maybe it was spread by Mac fanboys.
“The myth was supported by the fact that a lot of people do not respond well to sophisticated visualizations, like graphs, heat maps, or treemaps. While this is true, it doesn’t mean that people can not figure out new user interfaces.”
Huh? What does he mean by “people do not respond well”? It sounds vague. No support is given, but the author must have done extensive field testing on his own.
Graphs and charts are supposed to make it easier for people to visualize data. However, what do such visual aids have to do with a computer GUI?
Actually, treemaps do help considerably with the usability of complex devices, such as computers. For instance, a user who understands the directory structure of a partition will have a much easier time navigating and finding files in several situations.
Not emphasizing this basic directory-tree concept to users could be the biggest usability blunder ever (by both Apple and Microsoft), responsible for generations of helpless computer users.
“The proof comes from Apple, which continuously innovates with new UIs for its software products.”
What has Apple innovated in regards to the computer GUI, other than the trash can and, perhaps, Expose?
“Another important breakthrough in the contextual UI approach is the realization that function is more important than design.”
I think he means that function is more important than style. Function is integral to the design (hopefully).
“The famous Apple mantra that design is the function is true, but not everyone can design like Apple.”
Meaning “style-is-the-function” rules Apple’s design, hence we have items like the round mouse, monitors that can’t tilt down, default jelly-blob window buttons with minuscule clickable areas and no intuitive symbols, etc. So, it seems to be true – not everyone can design like Apple.
Earlier in the article, the author suggests the philosophy behind the Windows GUI: “User interface was not the place to be innovative. It was considered unorthodox and even dangerous to present the interface in non-standard ways because everyone believed that users were, to be frank, stupid, and wouldn’t want to deal with anything other than what they were used to.”
This GUI philosophy sounds more like that of current Apple fanboys than that of any other computer user — fans of Apple are always dogmatic about how consistency is so important to usability and about how important it is to “dumb-down” the GUI. Even the author showers praise on Apple for dumbing-down the Mac GUI. If Apple didn’t believe “that users were stupid,” why would they need to dumb-down their GUI?
I should have stopped reading at the “Thanks to Apple…” line. Can’t read any more. This article is a bilious, Apple-adoring mess.
Well, one thing is clear. Everyone is either a Hater or a Fanboy! I’ll leave it as an exercise for the reader to figure out which this poster is.
One thing is for sure, the iPhone UI has made a huge impact. Whether it is a conceptually new interface, or just one applied in a new way, it has a higher satisfaction rating than any other phone on the market. Several surveys have found that to be the case.
So go ahead and hate the iPhone, but don’t try to claim its UI is not a significant milestone.
Right. Why respond to a post when you can just pigeonhole the poster?
It’s clear to me this guy doesn’t know what the hell he’s talking about. Microsoft has been trying to move its UI to a fully contextual (or task-based) approach for far longer than OS X has been around. The task pane to the left of the folder/file view in 2000/XP is one approach Microsoft unleashed on the mainstream UI. The ribbon in Office 2007 (which I like a lot) is another. Vista changed the task approach they started in XP, likely due to user feedback given the amount of screen real estate it used, and converted the toolbar in Explorer into a task-based contextual toolbar. Select a removable drive and eject, autoplay, and burn to disc (amongst others) will appear as options. Select a folder and share, burn, and open will appear. I don’t think this one will last long either, since it’s too subtle, but to say Microsoft doesn’t do contextual UIs is ignorant at best.
To think OS X does things perfectly is ignorant as well. Sure, Microsoft offers dialog boxes such as the folder view depicted in the article, but that’s because Microsoft lets users customize their experience as much as possible, while Apple tells the user: you get this and you will like it. Before you call me a fanboi, I have two Vista boxes and a Leopard box, plus I use Tiger at work (graphic designer).

One example that’s bothered me ever since I got my Leopard box is a change Apple made in Mail between v2 and v3, I believe. You used to be able to set Mail to a three-column view (like Outlook). For some odd reason Apple removed this feature, and you’re left with the top/bottom split view only. On larger widescreens the three-column view makes much better sense. Why eliminate an option such as this? Note also the lack of options available to the user. Apple sets the UI and you’ll like what they give you, or else. I’d like to set Safari to open most new windows in a new tab instead of a new window altogether, but Safari gives me VERY basic tab controls (do you want to use them or not) and not much else. Sure, I like the look of OS X’s UI, and I like that it offers configurability in other areas, but let’s not confuse this with superiority.
This article seems to look at how badly Microsoft has handled self-configuring menus and then tars all possible implementations with the same brush.
Because of my Mom, I have been forced to use the Windows ME version of this idea, and while the implementation is a royal pain, just a little thought makes it clear that this could be made much easier to use – and the simplified menus would indeed be faster to use than ones that show everything.
Problems and solutions:
Number of visible menu items. In WinME it seems to be limited to the five (5) most-used items in a menu. Give me an option to raise it to ten (10) and I would be a happy camper. Others will have different numbers, and yes, I would like global as well as per-menu options.
Ordering. At present, the items are displayed in the same order they appear in on the full menu. There are cases where I would like control over that order in the shortened menus. The original choice makes sense 90% of the time, but if you really use a single option a lot, moving it to the top (manually) would help a lot too.
Submenus. At present, submenu options do not appear directly in the shortened menu. Again, there are limited cases where a manual option to do this would be a big help in using some programs. (A rough sketch of how such an adaptive menu could work follows below.)
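To make the idea concrete, here is a minimal sketch of the adaptive ("personalized") menu behaviour described above, assuming a simple usage-count model. All of the names here (AdaptiveMenu, visible_limit, pin, promote) are hypothetical illustrations, not the API of WinME or any real toolkit:

```python
from collections import Counter


class AdaptiveMenu:
    """Hypothetical sketch of a WinME-style personalized menu."""

    def __init__(self, items, visible_limit=5):
        self.items = list(items)             # full menu, in its original order
        self.visible_limit = visible_limit   # WinME seems to use ~5; here it is configurable
        self.usage = Counter()                # how often each item has been chosen
        self.pinned = []                      # items the user manually moved to the top
        self.promoted = []                    # submenu entries surfaced into the short menu

    def select(self, item):
        """Record that the user chose an item (from the short or full menu)."""
        self.usage[item] += 1

    def pin(self, item):
        """Manually keep an item at the top of the short menu."""
        if item not in self.pinned:
            self.pinned.append(item)

    def promote(self, submenu_item):
        """Manually surface a submenu entry directly in the short menu."""
        if submenu_item not in self.promoted:
            self.promoted.append(submenu_item)

    def short_menu(self):
        """Pinned items first, then the most-used items in original menu order."""
        remaining = self.visible_limit - len(self.pinned)
        most_used = {i for i, _ in self.usage.most_common(remaining)}
        frequent = [i for i in self.items
                    if i in most_used and i not in self.pinned]
        return self.pinned + frequent + self.promoted


menu = AdaptiveMenu(["New", "Open", "Save", "Print", "Export", "Options"],
                    visible_limit=3)
for choice in ["Open", "Open", "Save", "Print", "Open"]:
    menu.select(choice)
menu.pin("Options")                      # the user's pet feature stays on top
print(menu.short_menu())                 # ['Options', 'Open', 'Save']
```

Keeping the frequent items in their original menu order (rather than sorting by count) preserves muscle memory, which is exactly the complaint the ordering point above raises: the defaults are sensible most of the time, and the manual pin covers the rest.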
This is one of the worst articles I have read in a long time. The suggestions of the author seem to violate every UI standard. He seems to want ugly, inconsistent, and nonintuitive gadgets, and wants to hide options from the user as though they are stupid. Maybe his are.
“The elements of the menu are not buttons but check boxes, which allow multiple selection – another violation of classic user interface elements.”
Rubbish. I don’t know or care whether Windoze, Linux, or his beloved MacOS have multiple menu selection, but it is certainly supported in AmigaOS and has been since 1985.
He should read the “Amiga User Interface Style Guide”, it’s still the definitive reference source for GUI design: a classic. Maybe then he would not be putting his foot in his mouth with this crap.
Then again, since he is obviously an Apple shill, who knows.
I really HATE articles that are sooo full of personal opinion but try to sound like they are only delivering facts.
And I also didn’t like the “Thanks to Apple…” sentence, because in my opinion the Mac OS GUI isn’t really different from other window-based GUIs. I really don’t know why they should be the “GUI masters sent from heaven”, since they’re also using windows, checkboxes, buttons, and scrollbars – what is so special about it?
I thought you were going to talk about DIFFERENT ways of implementing GUIs, and I thought there might be truly revolutionary aspects in the article, but since you’re praising the Apple GUIs so much, I can’t really see any revolution.