Great idea, bad example

So a buddy of mine at work (thanks Mike!) sent me this cool link about how to use real math to measure the complexity of user interfaces. I read the article (feel free to go read it now) and I loved the idea of quantitatively measuring the effort required to use a UI to determine how simple it is.

While I think Aza is on the right track, I’d like to point out how the devil is in the details and how hard it is to map theory to practice.

In the example, Aza compares setting the time on a digital watch using buttons vs. a traditional analog watch crown. The key part is here:

Let’s start by figuring out how much information is minimally needed. For simplicity, let’s assume we are only setting time to the minute (so no need to worry about seconds) and that we are ignoring AM/PM considerations.

There are sixty minutes in an hour and twelve hours on a clock. Thus there are 60*12, or 720 possible times to which we can set a watch. 720 is roughly 2^9.5. Therefore, we know that setting the time requires a minimum of 9.5 bits of information.

And then later:

There are two possible actions in setting an analogue watch: choosing whether the watch is in the time-setting mode, and turning the crown to actually set it. The first action represents 1 bit of information (the crown can either be pushed in or pulled out), and the second action represents 9.5 bits of information (there are 720 possible times to which the watch can be set).
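For reference, that 9.5 figure is just log base 2 of 720. A quick sanity check (my own sketch, not from the article):

    import math

    positions = 60 * 12          # 720 possible times (minutes x hours)
    print(math.log2(positions))  # ~9.49, the "9.5 bits" in the quote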

The logic error is that just because the analog watch stores 9.5 bits of information, it doesn't follow that the optimal input (analog or otherwise) the human has to enter amounts to 9.5 bits of effort.

The problem is that there are 720 possible positions of the hour and minute hands on the face of the watch, so if the offset between the current time and the desired time is random, the average number of position changes needed to set the watch is 720/2 = 360. Depending on the implementation of the crown, how you input those 360 position changes can vary, and there's no suggestion of how any of them might map to the ideal 9.5 “inputs”. We have no idea if a single twist == 1 hour or 1 minute. Many newer analog watches also detect patterns, so that turning the crown quickly advances an hour while turning it slowly advances only a single minute per click.
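To put some numbers on that, here's a rough sketch (mine, with made-up crown mappings, assuming the user only turns the crown forward and the offset to the target time is random) of how the average number of crown actions depends entirely on what a single click means:

    def avg_actions(clicks_for_offset):
        """Average number of crown actions over all 720 possible forward offsets."""
        return sum(clicks_for_offset(d) for d in range(720)) / 720

    def one_click_per_minute(d):
        return d                   # every click advances exactly one minute

    def coarse_then_fine(d):
        return d // 60 + d % 60    # fast spins jump an hour, slow clicks move one minute

    print(avg_actions(one_click_per_minute))  # 359.5 -> the 720/2 = 360 figure
    print(avg_actions(coarse_then_fine))      # 35.0  -> better, but nowhere near 9.5

Neither hypothetical mapping gets anywhere close to 9.5 user actions, which is the whole point: the information content of the result says very little about the effort of entering it.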

What’s more interesting is that the 720 possible values can be stored in 10 bits. So you could use 10 DIP switches to set the time, and on average you’d only need to flip 5 of them, which is lower than Aza’s “optimal” calculation of 9.5. Obviously that would be very efficient, but not very user friendly, so the two don’t always go hand in hand.
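A quick sketch of where those two numbers come from, assuming the current and target switch settings are effectively random:

    import math

    positions = 720
    switches = math.ceil(math.log2(positions))  # 10 switches cover 2**10 = 1024 >= 720 states
    print(switches)                             # 10

    # With random current and target settings, each switch needs flipping
    # about half the time, so the expected number of flips is about 5.
    print(switches * 0.5)                       # 5.0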
