Thursday, December 16, 2010

Productivity

The head of my engineering department had a meeting yesterday to talk about getting from requirements to shipped product. He talked about our need to become more productive and more efficient--not because we were dogging it, but because the marketplace is getting more competitive. He made a couple of points that had the same clarifying effect you get when you've been knocking about in a dark room and you finally turn on the light. That "Aha! That's what I've been barking my shins on" kind of moment.

He defined just one metric for assessing the productivity of an engineering department: $/E
$ = revenue
E = number of engineering employees

The point is that any time you are exerting any kind of effort, you must ask "Is this adding value that someone will pay for?"

He also talked about efficiency, and he pointed out that there are only two ways to improve efficiency:
  • Add more value for the same amount of work.
  • Do less work for the same amount of value.
The second bullet leads to such questions as "Do we need a 90-page PRD to build this?" and "How much detail does the programmer need in the wireframe to know what the UI needs to do?"

He did not give the following sobering example, but it is food for thought along these same lines as we go into the new year.

Let's say that your product has a profit margin of 10% and let's say an employee costs $100,000 a year.

A company would have to sell $1,000,000 of product to add $100,000 to the bottom line.

Or it could lay off that employee.
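To make the arithmetic behind that example explicit, here is a back-of-the-envelope sketch in Python (the 10% margin and $100,000 employee cost are just the example figures above, not real data):

```python
# How much revenue must a company sell so that the resulting profit
# equals one employee's annual cost?

def revenue_to_cover(cost, margin):
    """Revenue required for profit (revenue * margin) to equal cost."""
    return cost / margin

employee_cost = 100_000  # annual cost of one employee, per the example
profit_margin = 0.10     # 10% margin, per the example

print(f"${revenue_to_cover(employee_cost, profit_margin):,.0f}")  # prints $1,000,000
```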

I worked for a guy who had been a colonel in the Green Berets. He used to describe poor performers by saying, "So-and-so isn't worth their rations."

$/E

Tuesday, December 07, 2010

Top four misunderstood expressions

There's a great column in UXmatters on the Freemium Model. What I liked most was that the author resurrected B.F. Skinner and reinforcement ratios. I was talking with my wife this weekend about what I consider to be the three most misunderstood expressions, and this column reminded me that B.F. Skinner is a source of a common misunderstanding--so my list has grown to the following four most misunderstood expressions:
  1. "God rest ye merry, gentlemen, let nothing you dismay." It means, "Hey guys, I hope God keeps you happy, and don't let anything scare you." Putting the direct object "you" in front of the verb "dismay" throws folks--that and the fact that they don't home in on the first comma, the one before "gentlemen."
  2. "Wherefore art thou Romeo?" Means "Why did you (hunky guy I really like) have to turn out to be Romeo--my enemy?" Wherefore means "why" and note the lack of comma before Romeo.
  3. "Suffer the little children." Means "Put up with the kids."
  4. And last, my buddy, B.F. Most people interpret negative reinforcement to mean what B.F. Skinner calls "punishment," i.e., the doing of something unpleasant to make someone stop doing something (like the electrical shocks Bill Murray administers in the lab in Ghost Busters). Actually, negative reinforcement is the removal of something unpleasant to encourage someone to keep doing something. If a teacher cancels weekend homework because a class has had perfect attendance, that is negative reinforcement. The key is the word "reinforcement."

Thursday, December 02, 2010

Designing for the total mobile experience

I've been working on my first "smart phone" project--investigating how our managed security services portal could accommodate smart phone users. If nothing else, it forced me to take the plunge about a month ago and get an iPhone. Up to now I have used phones to call people and take calls from people. At least I was using a cell phone and not one of those things that hang on the wall and you have to crank.


I was lucky to have done an internal presentation about 2 months ago for IBM on the same topic I will be doing at the STC Summit in Sacramento (Designing user assistance for trial demo software), and the speaker right after me was the manager of the IBM Mobile Research Center. Needless to say, I stayed on to hear what he had to say. (Say what you want about large corporations, how many companies have a Mobile Research Center?)

The most important insight I got from his research was that users distribute a task between their smart phone and their workstation. Smart phones are easier to access than workstations and good for monitoring; workstations are better for doing work than smart phones. I know, kind of duh!, but it's led to a different approach to the UX design than I would have taken.

I am writing use-scenarios that envision the total experience:
  • What triggers the user to access our portal from a smart phone?
  • How much of the task needs to be/should be done on the smart phone?
  • How do we gracefully transition the completion of the task to when the user gets back on our portal from his workstation?
It has really helped me stay away from just redesigning pages to look good in constrained real estate. 

And the way cool part is that my wireframing tool, Balsamiq Mockups, has iPhone templates.

 
Oh brave new world :-)

Friday, November 12, 2010

Better than Disneyland!

I don't think there is any way I'm going to be able to tie this blog into user assistance or user experience, so I'm writing it off as "Hey, it's Friday and it's my blog :-)"

Went to a gritty little bar in Atlanta's Midtown last night that has an open bluegrass jam session every Thursday night. I mostly play alone--three times a year I get together with an old high school friend and play Dobro to mostly what would be called folk songs.

I was nervous! My minimum objective was to find the place and walk in with my guitar. If I only did that, it would be a baby step in the right direction. My other objectives were not to cry and not to throw up.

These folks were good! It turned into a group of about 4 fiddlers, 2 banjos, 2 mandolins, a stand-up bass, and a couple of guitars--oh yeah, and a Dobro player, me!

Not only did I meet all my objectives, I actually got my Dobro out and played along. Bluegrass jams have an interesting dynamic. There is a real culture of inclusion. Every player is given an opportunity to solo. Sometimes when I got the nod, I could only shake my head and say "I got nothing." But then there were the times, when the lead banjo guy or the guy on bass said, "Take it, Dobro," that I had something and jumped in.

They called me "Dobro!" A sixty-one-year-old man should not get giddy, but I gotta tell you, I'm still flying over that!

Playing bluegrass in a group that big, what with its driving rhythms and full sound, had a very physical sensation. It reminded me of when I would go sailing in a brisk wind. There is an awesome sensation of being moved by something that is both soft and powerful at the same time. Air pushing a boat at adrenaline-provoking speed--vibrating strings carrying you in a current of harmonious sound.

I will never be the same.

Thursday, November 11, 2010

A simple productivity tool

A lot of vectors converged this week:
  • Got back from vacation with the usual back-from-vacation-what-is-it-I-do-that-these-folks-pay-me-for fog
  • Recently got a new boss (former boss from earlier position)
  • Some projects slowing down, some simmering under the radar, some seemingly waiting for...oops, waiting for me!
On and off I have used an Excel spreadsheet to track my time against projects, and that has been a useful tool--psychologically, it keeps my nose to the grindstone. But it has some limitations:
  • I don't share it with anyone--hey some things just need to stay private.
  • It is "time spent" focused, not achievement focused.
So I created a new tool to use: a Google spreadsheet to track project status and to log activities against their respective projects--not time, but entries like "Held meeting with SME to determine how single sign-on works," along with the date. Each project has its own tab with the following:
  • Project name
  • Description
  • Status (green, yellow, red)
  • Status description (where the project is currently or why it is yellow or red)
  • A log to record activity and date
I also have a main summary tab that pulls all of the above information except the log and that shades the status cell with the appropriate color (use the "change with rules" option for the background color).

Tip: To pull data from another sheet in the same file:
  1. Click the cell on the summary page where you want the data to show.
  2. Type an equal sign.
  3. Navigate to the sheet that has the data you want to display.
  4. Click the cell that has the data.
  5. Press Enter.
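For example, suppose the status cell lives in B2 of a tab named Portal (a hypothetical tab name for illustration); the steps above would leave the summary cell holding a reference like:

```
=Portal!B2
```

If the tab name contains spaces, Google Sheets wraps it in single quotes, e.g., ='Mobile Portal'!B2.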

And since it is a Google doc, I have shared it with my boss.

My new routine is to look over the status summary and the individual tabs each morning.

Here is what I have noticed:
  • I am motivated to do something so I can log an activity. And since neither my boss nor I are stupid, I look for a meaningful activity.
  • When my status indicates I am in a holding pattern waiting for someone/something, I send an email to the party I am dependent on, or I schedule a meeting with them--and log that activity!
  • My focus is on making progress and not on clocking in and out.
I find this particularly helpful in my current environment, which is an Agile shop. I am on several scrum teams, and I have some strategic project work I am involved in. My daily scrum reports do not adequately let me reflect on my total contribution. This status spreadsheet does. BTW, the log is great for when I call into my daily scrum meetings: I can see exactly what I accomplished the previous day for that project.

Friday, October 01, 2010

Productivity: Live by the sword, die by the sword?

I was not only enthralled by an article in the Sept/Oct issue of Intercom (STC's magazine), but I was equally interested in my own reaction. I love it; I hate it. Wuh?

The article, "Measuring Productivity," is very well written by Pam Swanwick and Juliet Wells Leckenby, two STC members. This is a great article if you are involved in managing technical communication projects. That's the part I LOVE!

But a part of me is concerned that it might focus too much on what we produce and not on what we contribute, in terms of helping a design become more user centered or helping a team build knowledge that it can leverage. What does it do in the case when a writer spends a lot of time helping the developers improve the product but that effort does not get reflected in the writer's output (but does improve the user experience)?


Read the article and share your reactions. I've started a discussion in the STC group on LinkedIn if you want to weigh in there.

Tuesday, September 28, 2010

Zombies, Expertise, and Post-Apocalyptic Scenarios

A lot of things kind of converged yesterday. I had lunch with my friend and colleague Miranda Bennett, and she was excited over a new game--lousy UI but engaging premise. Apparently you collect pieces and build a fort-like structure and then zombies come out to get you at night. I'm not a gamer, but I commented that zombies were a popular construct these days and wondered why. Miranda posited it was less about zombies and more about surviving in a post-Apocalyptic world. Hadn't thought about that. Most zombie movies do have that theme.

Miranda went on then to observe that people are being taught that they don't know how to cook (as she nuked her prepackaged lunch). She pointed out that even the simplest dishes, such as baked chicken breast, come prepackaged and pre-prepared. Add to that the other ways technology has embedded expertise into our tools, e.g., spell checker, calculators, etc., and no wonder we view post-Apocalyptic scenarios with horror--we will be helpless. Maybe the zombies are just metaphors for our atrophied brains--that would account for their craving to eat brains.



All this on the same day I received my author's copy of Qualitative Research in Technical Communication, edited by James Conklin and George Hayhoe. At long last, the findings from my doctoral research got published as Chapter 14. In it my major professor, Tom Reeves, and I discuss the role of experts on usability teams. They can streamline the process and even improve the results, but at a cost to team learning. We rely on experts and take them at their word, sometimes deferring our own critical thinking and creativity.

As user assistance tries to take the role of expert (see my article on User Assistance in the Role of Domain Expert in UXmatters) how do we avoid contributing to post-Apocalyptic impotence (PAI--I just made that up)?

I think the answer is simple: expertise should be about transferring insight, not about dictating steps. It should enrich the question so the user can wrap the answer in his or her own context and data. That way the user is enabled and not just directed. See my UXmatters column Making the Deal: Supporting the Demo with User Assistance for a practical example.

Thursday, September 23, 2010

Odd First Sentences

When I read my title, I was reminded of the Steven Wright joke, "I got in a terrible fight at the roulette table with the croupier over what I considered to be an odd number." (It occurred to me as I read the title that all first sentences are odd--literally.)

OK, so I'm reading an article in American Rifleman (not mine, someone left it lying around at work) and the opening sentence is "It's surprising how many of our most useful and reliable cartridges started in the military."

You gotta wonder what is so surprising about that. I don't know much about guns and ammo, but if you came to me and said you needed something good in that line and asked my opinion about where to go, I'd probably come up with "See the Army." That's what they do, they shoot at people and they try to do it accurately and with great effect. They should know.

About 35 years ago I picked up a copy of my company's employee newsletter, and there was an article profiling one of the employees. Its opening line was "Bill certainly fits the mold of 'he's one of a kind.'" If he's one of a kind, why is there a mold? The sad part is that we were a manufacturing company; you'd think we would understand the concept of a mold.

And now a bit of a bonus, the original line that inspired the Bulwer-Lytton award:
"It was a dark and stormy night; the rain fell in torrents--except at occasional intervals, when it was checked by a violent gust of wind which swept up the streets (for it is in London that our scene lies), rattling along the housetops, and fiercely agitating the scanty flame of the lamps that struggled against the darkness."

 --Edward George Bulwer-Lytton, Paul Clifford (1830)

Note that its claim to infamy has more to do with its length and rambling nature than with any inherent badness of its oft-abbreviated version, "It was a dark and stormy night." Had he left it at that, I think it would have been right up there with "Call me Ishmael." Hmmmm. Maybe I'll do a sequel to Moby Dick from the perspective of his girlfriend. Opening line: "Call me, Ishmael." (Apologies to Lynne Truss)

Wednesday, September 22, 2010

Three Myths

When I hear people talk about not getting respect for being a technical communicator (or not getting paid enough) I wonder if the following three myths are holding them back:
  • Thoroughly described = adequately explained
  • Accurate = useful
  • Grammatically correct and correctly punctuated = well said
These are the "writer" myths, based on the belief that the value we bring is our ability to write. Technical communicators should be "sense makers" and "explainers." We should deliver timely insight that enables the user to act more like an expert.

Easy words to pen; hard ones to live up to.

Thursday, September 09, 2010

Dashboarding

I'm reading a lot about dashboards these days. What a fun challenge in technical communication, trying to put ten pounds of information into a one pound UI.

I'm reading Stephen Few's Information Dashboard Design, and he makes the most elegant points:

Reduce the non-data pixels:
  • Eliminate all unnecessary non-data pixels
  • De-emphasize and regularize the non-data pixels that remain
Enhance the data pixels:
  • Eliminate all unnecessary data pixels
  • Highlight the most important data pixels that remain
He also had some interesting things to say about pie charts, as in basically they all suck. His point is that human perception cannot compare 2-D areas as well as other visual attributes such as length. He suggests that bar charts make better comparison graphics than pie charts.

So I'm trying a little experiment on myself. I keep a time sheet in Excel so I can track where my time goes (plus I find that recording my activities makes me way more productive--try it some time). I've been keeping a little pie chart dashboard on the sheet so I can see how my time gets allocated. It seemed useful to me.

So as an experiment, I've added a bar chart of the same data.

Here is the pie:


Here is the bar:


Jury of one is out on this. What's your opinion?
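As a footnote, Few's length-beats-area point can even be sketched in plain text--here is a quick Python toy with made-up hours (nothing to do with my actual time sheet), where the bar lengths are instantly comparable in a way pie slices are not:

```python
# Text-mode bar chart of a hypothetical weekly time allocation:
# one '#' per hour, so comparing categories is just comparing lengths.

def bar_line(task, hours, width=10):
    """One row of a text bar chart: padded label, '#' per hour, then the value."""
    return f"{task:<{width}}{'#' * hours} {hours}h"

time_spent = {"Design": 14, "Writing": 11, "Meetings": 9, "Email": 6}

for task, hours in time_spent.items():
    print(bar_line(task, hours))
```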

Thursday, September 02, 2010

Menu Blind Spot

I'd write this one off to stupid user (moi) except that I saw it a lot when I was a usability tester. I wanted to change presentation colors in my email client. I was pretty sure I did that in Tools > Preferences. It is important to note that my opening assumption, I repeat, was that I would find it under Tools > Preferences. I clicked on Tools.


I couldn't find Preferences.

After some frustrated head scratching wondering where they would have put it, I saw it.

I would see the same phenomenon in usability tests with drop-down menus where the top choice was preselected (reverse video). Humans process a list like this:

Top item set off typographically in some way = column title and is not a choice; therefore ignore.

The reversed video selection became invisible as users would scan the not-highlighted choices under it. Because Preferences is a single entry and is underlined, I think I processed it the same way.

These mental shortcuts usually make us more efficient--but sometimes they get in our way. As designers, we should try to avoid making the top choice different in any way.

Tuesday, August 31, 2010

Rubifying the Language

My favorite Scrum Master used the word "uniqify" in a sprint planning call today, as in "Could you uniqify that expression?" I poked him through the instant message tool we have because he had recently taken me to task for using the word "epistemology" in a design session.

It turns out that this is a fairly common pattern in Ruby, as in stringify, uniquify, htmlify, etc.

The rule seems to be that the verb [term]ify means to take the ensuing object of the verb and give it the qualities of the [term].
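Here is my own rough sketch of that rule in Python (the function names and behaviors are purely illustrative--they are not from Ruby or any real library):

```python
import html

# Illustrative analogues of the "[term]ify" pattern: each verb gives
# its object the quality named by [term].

def uniquify(items):
    """Give a list the quality of uniqueness (keep first occurrences, in order)."""
    return list(dict.fromkeys(items))

def htmlify(text):
    """Give a string the quality of being HTML-safe (escape special characters)."""
    return html.escape(text)

print(uniquify([3, 1, 3, 2, 1]))  # [3, 1, 2]
print(htmlify("a < b"))           # a &lt; b
```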

Of course, this has instances in our day-to-day language; that's where all these types of linguistic manipulations have their origin. To "liquify" is to make the object like liquid. Its opposite is "solidify." It's very similar to putting "ize" at the end of a term to make a verb--but I sense a subtle difference I haven't yet quite distilled.

As an amateur linguist, I LOVE when a language can do this. Arabic does it on steroids, having a very complex set of patterns that decomposes just about every word down to a tri-literal root. Verb forms take their nuances from the root, as do noun patterns for the doer, the done-to, and the where-done.

English doesn't do it nearly enough. Sure, add an "er" to the end of a verb to get the doer, as in writer, rider, sleeper. Sometimes an "ee" to get the done-to, as in payee. Do we even have a pattern for deriving the noun for where-done from the verb? For example, the Arabic word for school, madresa, comes from a pattern that makes it "the place where studying is made to happen."

HTMLify is my favorite so far. "Can you express this in HTML?" becomes "HTMLify it."

Tonight I'm going to mealify some leftovers. Anyone who thinks that means "reheat" has never seen what I can do for leftovers.

Tuesday, August 24, 2010

Knowledge Life Cycle and Social Media

I usually try to keep work and blog separate, but I gotta say I really like working for IBM and especially the IBM Security Services group. Why? I get paid to have some really interesting conversations.

Last week I was in a meeting where we were discussing the right way to use Lotus Connections in our work team--versus a department Wiki we already have.

Lotus Connections is a social networking app that has forums, file uploads, activities, a wiki, yadda yadda. BTW, it's pretty good. But the discussion was "When do we use that tool versus our more formal department Wiki--where we keep departmental procedures and such?"

The conclusion was actually quite elegant in its simplicity: We will use Lotus Connections to hold the conversation; we will use the Wiki to curate the answer. I'm sure that's not original--but we got there on our own, nonetheless.

I think this is a pattern that has lots of applications where social media has started to get traction. Online user doc vs. user forums, for example. The STC SIGs and the STC Body of Knowledge are another example that I think about a lot and that seems to apply here.

I think this is a field that needs a lot of discussion and open dialogue, specifically, how to manage the life cycle of knowledge from ideation and churn, through vetted "this is how it is," and eventually to "this is last week's dead fish."

Of course, my traditional mistake is to impose formal process and control on what should be left open and organic, but still, I feel that some sort of process or guidelines would be useful. Here's my laundry list of questions. More questions, answers, general trouble-making--all are welcome:
  • What are the stages of knowledge or what are the categories of maturity/credibility, whatever?
  • What other dimensions should go into "whatever" in the question above?
  • Where does knowledge live during these stages?
  • Who moves its classification during its lifecycle?
  • At what point is someone liable for the consequences if knowledge is acted on and proves wrong?
  • When can knowledge be branded as intellectual property in an evolutionary model like this, and who gets to own it?
And I apologize if this is all naive and the solution is fully developed by now. I was very involved with Knowledge Management before social media had the kind of impact it has today.

Friday, August 20, 2010

Mixed message


Reminds me of the Steven Wright joke where he named his dog "Stay" just to confuse him when he called him.

Wednesday, August 18, 2010

The Software Development Death Cycle

Does this look like your development cycle?

All is joy and celebration on the product management side when the project begins. Then come the iterations as the marketing requirements document is passed back and forth with engineering. (BTW, is it just me, or does it seem we lose about 1/3 of a project's useful development time in this phase?) Then engineering goes to work and produces something they send to QA. The product manager sees it and goes postal. Well, been there!

I worked at a place where there was a product manager--let's call him Dave--who kept stirring the pot in QA when he got a glimpse of the UI coming out of development. The design and development team wanted to take out a contract on him. "How do we keep Dave out of the process that late in the game? He's creating too much churn."

I saw the problem differently--why were we disappointing Dave, the guy who brought us the work to begin with? I could understand if we were disappointing the customer; after all, they hadn't written the requirements, Dave had. I could understand if we were disappointing our partners; after all, they hadn't written the requirements, Dave had. But how was it that we were disappointing Dave?

I concluded it was the requirements process itself that was failing. It relied on words, and words were screwing the deal. As a technical communicator, that was a harsh realization, but I have come to learn the following:
  • Gopen and Swan are right when they say "We cannot succeed in making even a single sentence mean one and only one thing; we can only increase the odds that a large majority of readers will tend to interpret our discourse according to our intentions."
  • Words, then, create the illusion of agreement.
  • And my own realization is that any time words are a problem, more words are never the solution.
So we decided to quit trying to solve the requirements problem by writing better requirements. Instead we moved UI design to the front of the design process and made it a collaborative conversation among the product manager, the UX designers, and the developers. I've since had the opportunity to play with this model and improve it. More importantly, I keep re-validating that it works.

OK, here's how it works.
  1. Start with a list of the requirements; it doesn't have to be pretty or even very good. It's a conversation starter.
  2. Sit in a room with a product manager, a UX designer, and a developer and start creating a scenario that illustrates a requirement. Make it an explicit example: Who is the user? What problem is he trying to solve? How would our product fit in? Tell the story and draw pictures (wireframes).
  3. Do that for all the requirements, or at least for the most important ones.
  4. Have the developers size the solutions.
  5. Let Product Management select from the solutions as much as the development bandwidth will allow.
  6. Go forth and code.
That model can be iterated down to as small a granularity as you want. For example, do it for the top two "we know we gotta have this" and get the dev team coding while you sort through the rest of the requirements.

Here's why this works:
  • The value that product managers bring is that they understand the problem space. Detailed requirements documents tend to end up being solution oriented--which takes them out of their sweet spot.
  • Developers design better solutions when you clue them in up front what the problem space is. Treat them like architects and not like carpenters.
  • UX folks can model a product's behavior and validate it with stakeholders and users with little to no code (that equates to fast and cheap).
I work in an Agile group now, and if we have a particularly snarky problem, we will put it in a design spike. That is a dev cycle in which we don't put a bunch of engineers on it. We iterate the concepts until product management, development, and UX come to consensus on an approach that is saleable, buildable, and usable.

No big magic, but the core ingredients are not substitutable:
  • Early in the process
  • Collaborative among product management, development, and UX

Tuesday, August 17, 2010

This can only mean one thing...

A while back I blogged that a website I use had been redesigned--seemingly to reach an audience younger than me. Here is what that design looked like:



Well, I went back today and it looks like this:



What made them change? Two theories. One is that they read my blog. Seeing as how I did not make the list of most influential bloggers in Tech Comm, I am summarily dismissing that theory.

That leaves the most common reason UIs undergo abrupt and dramatic changes: The president's spouse or parents tried to use the site and complained.

Happens every time.

Friday, August 13, 2010

Good, Better, Best

Someone corrected my use of personas the other day and pointed out that the plural is personae.

File under reason 42.b of "Why people hate technical communicators."

It reminds me of "data."

A good sentence says "The data is clear on this."

The better sentence says "The data are clear on this" because we know that the singular is datum and data is the plural.

The best sentence says "The data is clear on this" because that's how real people talk.

Miller Williams, a former poet laureate, said he wanted to write poetry that cats and dogs could understand.

I don't know about you, but I never met a dog that could relate to personae--not even the ones named Rex.

Monday, August 09, 2010

In defense of my [pejorative] self

Some descriptions seem to carry negative baggage and get thrown at me from time to time. The only problem is that not only do I find these terms NOT pejorative, in fact, I have worked hard to earn them.

One is "writer." I remember sitting in a meeting and having someone voice her concern that several people in the room had referred to themselves as "technical writers." (I was one.) I know the history of this. The Bureau of Labor Statistics (BLS) has a rather outdated definition for technical writer. This person was advocating the title "technical communicator" to differentiate what we do from this outdated definition.

I grew up wanting to be a writer. When asked what I would most like to be, I never answered "a communicator." I think that the role of technical writer is a legitimate subset of the profession known as technical communication. Technical writers focus on communicating with words. The problem with the BLS definition was not the term "writer"; it was that they seriously understated what goes into technical writing.

I don't want to undermine a campaign to get technical writers more respect and more pay; I just don't want to have to apologize for what I do, and in fact am pleased to do, i.e., being a technical writer. Sometimes I'm something else; in my current job, for example, I am a user experience architect, another role in the field of technical communication. But when I take on the task of writing user assistance, I'm OK telling folks I'm a technical writer.

Another pejorative is "academic." In its negative sense, it means "irrelevant to real world applicability." In its positive sense, it can mean well studied in the research that has been done in a field and capable of generating valid, reliable knowledge by conducting original research.

I've worked real hard to try to qualify for that latter meaning, so I chafe a little when my desire to apply rigor is branded "academic" and meant to imply "irrelevant."

BTW, I'm sometimes branded pedantic--and that one I deserve and should try to be less of.

Friday, August 06, 2010

Phrases to Avoid

A tweet sent me to a website that lists phrases to avoid in technical writing. Personally, I find its list to be pretty mild. Things like:

a majority of -- most

a sufficient amount of -- enough

according to our data -- we find

I'd like to add MY list of phrases that should not show up in your documentation:

  • Hey, dick head, ...
  • After you put out the fire...
  • Your browser is as lame as you are.
  • If you figure out what this feature does, let our tech writers know.
  • Some side effects might include...
  • In our defense...

Thursday, August 05, 2010

The bandwidth discussion continued

Yesterday's blog about bandwidth and information attracted some very insightful comments and got me to thinking more about the issue of "Do videos take advantage of their bandwidth?" In other words, are they proportionally better given how much more information they convey?

The ensuing discussion brought a couple of things to mind. I remember from one of my technical communication courses that line drawings are often preferred in a manual (over photographs) because photos have too much detail. Drawings help focus the reader on the detail you want to draw attention to. I find the same principle with low-fidelity wireframes over screen prototypes. I think the same can sometimes be said of video--is the fidelity a value-add or a distraction?

Another interesting paradox I noticed is that we tend to assume that videos are good for the neophyte. And Ken Hilburn makes an interesting point about how we instinctively filter out the unnecessary detail of a video. But that is more true for the experienced user than the neophyte. For example, I once tried to teach my mother-in-law how to use Yahoo email. She had a hard time getting through the browser because she thought everything was important. I was constantly saying things like "that's a banner ad, ignore it," or "that's the disclaimer text." We forget how media literate experienced users are, and how adept their filters are.

Where all of this has led me is not to dismiss video, but to approach it with a designer's eye much the way Tufte would have us look at a chart, namely, is each byte of information worth the bandwidth? Another way of phrasing the question is "Am I taking advantage of the bandwidth?"

Let's revisit the example of the dobro video. If the purpose were to instruct, then maybe a good design would be to have a synchronized split screen of closeups on the picking hand and the slide hand. That way the bandwidth would be more fully invested in the information of value.

(BTW, please don't take this as being critical of those generous musicians who share on YouTube--I am so grateful for what they do for absolutely free.)

So if you are thinking of doing video, ask what is the information of interest and plan the video to put its bandwidth on that information. Eliminate spurious mouse movements, focus on fields of interest, shade out non-relevant areas of the UI, etc. When we do traditional video, we point and focus the camera. Same mindset for screen captures--don't just sit the "camera" on a tripod and shoot the whole landscape.

Wow, makes me want to wear my director's beret. Hoping for cooler weather soon.

Wednesday, August 04, 2010

Bandwidth and Information

I decided recently that I had "stopped growing musically." You have to be a Catholic flower child from the 60s to inflict that kind of guilt and deprecation on yourself over what is meant to be a hobby--something you do for fun.

(not from the 60s, but you get the picture)

So I hit upon a plan. I found this great dobro player, Martin Gross, who has a terrific YouTube channel. So my plan is to learn one song of his every month. OK, all happy again now that I am "working" at having fun.

As I've been working on "Blues Stay Away from Me" I've made a couple of observations.

The old way of learning a song was to play it on the record player and keep hacking away at it until you figured out how the person was playing it. Essentially, not much has changed except that with YouTube you get the video channel as well as the audio. And honestly, that makes it easier, but not in proportion to the orders of magnitude increase in information that the video makes available. Anyone who's worked on televisions is well aware of the difference in bandwidth between the video signal and the audio. There's just a lot more information in the video signal, and Mother Nature is an exacting accountant (the cost for transmitting information is bandwidth--the more information you are hauling, the wider the highway has to be).
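As a back-of-the-envelope check on that bandwidth gap, here is a small sketch comparing raw CD-quality audio against uncompressed standard-definition video. The figures (44.1 kHz 16-bit stereo, 640x480 24-bit 30 fps) are illustrative assumptions, not measurements from any particular YouTube stream:

```python
# Back-of-the-envelope comparison of raw (uncompressed) bitrates.
# All numbers are illustrative assumptions, not measurements.

def audio_bitrate(sample_rate=44_100, bits=16, channels=2):
    """CD-quality PCM audio, in bits per second."""
    return sample_rate * bits * channels

def video_bitrate(width=640, height=480, bits_per_pixel=24, fps=30):
    """Standard-definition RGB video, in bits per second."""
    return width * height * bits_per_pixel * fps

audio = audio_bitrate()   # 1,411,200 bit/s, about 1.4 Mbit/s
video = video_bitrate()   # 221,184,000 bit/s, about 221 Mbit/s
print(f"audio: {audio / 1e6:.1f} Mbit/s")
print(f"video: {video / 1e6:.1f} Mbit/s")
print(f"video carries ~{video / audio:.0f}x the raw bits")
```

Even before compression enters the picture, the raw video signal is hauling a couple of orders of magnitude more bits per second than the audio.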

Seriously, if you had to choose between learning a song by just listening to it without seeing the video, or watching it without hearing the audio, you'd be much better off just listening.

Kind of ironic seeing that the video has a ton more information in it. So what gives?

Well, most of the visual information is irrelevant. The color of the guitar, the spacing between the strings, the freckles on Martin's hand, etc. The most important information is what fret he is putting the slide on and what strings he is plucking. Since this is not a split-screen video, those two pieces of information are at opposite ends of the display and it takes a bit of replay sometimes to figure out what he's doing.

It might be true that a picture is worth a thousand words, but apparently a K of sound is worth a Meg of video.

About this time you're checking the header of this blog, thinking it was supposed to be about user experience and user assistance stuff. Well, it made me think about screen cam versus written procedures. Does the same thing apply here?

I think it does. I've felt for a long time that if the only thing we have to say is click this and type that, then a video is not the way to go (and a LOT of software videos are of that variety). Lot of bandwidth for just a little information. I wonder if Tufte's concepts of chart junk and the data-to-ink ratio can be applied to a useful info-to-bandwidth analysis. Things like tone and physical manipulation in motion seem to justify the kind of bandwidth that video carries. I don't think of this as a transmission efficiency issue, any more than Tufte was trying to save ink costs. The human bandwidth and ability to focus are more at issue here.

Plus, it's easier to scan a written procedure to get to that snippet of information I need than it is with a video.

So the point is twofold:
  • Written words are still an incredibly efficient channel for conveying information. Quit beating yourself (or others) up if you consider yourself a writer and that to be your primary channel. "I am technical writer, hear me roar."
  • If you can afford to throw a video or two into the user assistance, do something worthy with that bandwidth.

Thursday, July 29, 2010

First Eyes and Last Eyes

Anyone who is a technical communicator gets involved in reviews, either doing them or getting them. There is an enormous difference in the appropriate level of feedback to give depending on whether you are being asked to look at an initial version (first eyes) or the almost ready for prime time version (last eyes).

Quick example: if I am doing a first eyes review and the document contains the word "utilize," I recommend that the writer say "use." If I am in a last eyes review and the document contains the word "utilize," I make sure it is spelled correctly, or appropriately for a British vs. US audience.

That example is a bit simplistic, but you get the point. First eyes reviews should look at larger issues and should invite new perspectives, as in "Have you considered taking this other approach?" Last eyes don't add a lot of value with suggestions that essentially would change the scope or direction of a document or user interface right before launch.

In a similar vein, when I submit academic research articles for peer review, I'm always amused by the peer reviewer who suggests that I use a different sampling method or change my interview protocol. Thanks, I'll just hop in my time machine and redo the study. Reviews that suggest different ways to analyze or interpret the data are much more useful. The ones I love most are the ones who help me articulate my points more clearly. Now if I were in a conference room with these folks planning my research, I would have entirely different expectations.

I'm currently wrapping up a project at work where I have been taking a manual written by a partner and basically rebranding and modifying content to reflect how we have implemented their product in our solution. This is very close to a last eyes review (given project time and resource constraints) and I have to be careful not to get out of scope and start rewriting one author's (and company's) style to match mine. It's not always easy. I can make small changes, such as changing "wish" to "want" (translates a LOT differently in certain languages) but I have to ignore some annoying rhetorical differences in how they treat procedures and how we do. (Can you say "doubles the scope?")

Some technical writers balk at this and say it's a question of quality: "I just can't lower my standards." I don't think these writers understand the business of writing, nor are they particularly skilled at critical thinking. By critical thinking I mean being able to discriminate between what is important and what isn't. Running to the high ground of quality sometimes just puts our heads in the clouds.

When asked to review a document or a user interface, we should ask ourselves are we in a first eyes review or a last eyes review. If first eyes, then our review should be critical and ask questions that challenge what could be wrong assumptions ingrained in the entire approach, a la "Is this really the optimal workflow for adding a new user?" Last eyes should assume for better or worse the work represents what the author or designer is trying to do and just make sure that distracting glitches are caught and removed, a la "Password is misspelled."

BTW, the more you treat last eyes reviews like first eyes reviews, the less likely it becomes that you will be invited to do first eyes reviews. Ironic but true.

Thursday, June 24, 2010

Sexy vs. Usable

Whenever I get stumped on a UI, I ask: is it a design issue, or am I just being stupid? And as I have publicly pointed out in this blog, sometimes I'm just stupid. Got stumped on this one for a while this morning:


BTW, very pretty dialog box. But the install button was disabled and I couldn't figure out why. Thought something might still be loading in the background so I waited. Finally figured out it was waiting on me...to accept the terms of the license agreement.

I don't think I was stupid on this one. I wasn't seeing the gray box to accept the terms, nor did its label catch my attention alerting me I had an action to complete.

If I could redesign this, I would make the check box white (nothing says empty as well as white) and I'd add a tad of space between the box/label and the paragraph above it.

And I mean it, it is a pretty dialog box, and I should know, I stared at it for thirty seconds.

Wednesday, June 23, 2010

User Adoption: A War with Two Fronts



I know, I ride Rogers' old horse beyond its intended range, but it just stays a useful model for a lot of what I do.

We can identify a point in an acceptance life-cycle with a vertical bar perpendicular to the x axis and somewhere along it. Then essentially we can say that we've got the population to the left of that line on board, and the ones to the right are the resistors we are still trying to win over. So the traditional model in my mind has been "resistance lines up on the right."

But I'm becoming increasingly aware of a negative image to that model, where resistance lines up on the left. Innovators and early adopters will resist efforts to lower the entry threshold to a technology, preferring to keep the club exclusive. "We had to learn it the hard way, so should they." Or "If you make it too easy, then anyone will be able to [do my job][look as smart as me]."

There are so many examples that I am embarrassed it took me this long to notice the pattern well enough to articulate it: Linux/Unix's "We don't need no stinkin' GUI," VCR vs. film, digital camera vs. film, sites like this one vs. hard coding HTML.

This means any user adoption campaign is essentially a war waged on two fronts: Trying to entice the later adopters to come on board while battling resistance from the early adopters to anything that makes it easy for them.

I suspect this problem is most pronounced in non-profit and governmental organizations that are not as driven by the economics of user adoption as commercial enterprises are. I also suspect it is higher in technology communities. No data, just hunches.

Sounds like a good conversation for over beers after your next professional association meeting. Do me a favor and save the napkins for me.

Tuesday, June 22, 2010

New Menu Idea

Just helped a coworker figure out how to reopen his style and formatting palette in Word. He had shut it down because it was getting in his way, and then he needed it back.

That happens to me a lot. It's gotten to the point that I am so reluctant to turn anything off because I'm afraid I'll never figure out how to reactivate it. Well, every problem is the seed for an innovation!



Hey, I want credit if Microsoft uses this!

Thursday, June 10, 2010

Yes, Virginia, there are stupid users.

I just didn't think my wife was sounding diligent enough about looking out for the UPS delivery guy, so I decided to work from home this afternoon so I knew someone would be here to accept delivery of the new guitar.

As I was working in my loft, I periodically checked the UPS tracking site to see if the status changed to indicate it had actually been dispatched. The current status message was a bit vague.

I hit refresh (for the 30th time in 30 minutes) and sure enough the status changed--to Delivered!! That got me a bit anxious, as in "TO WHOM--NOT ME!!!" It said "Garage."

I panicked. They delivered my Mike Auldridge guitar to a garage!!! Then I wondered something, so I went downstairs and opened the door to my garage.

What do you know. A guitar. So much for Mr. Eagle Eye.

Wednesday, June 09, 2010

I am like so old school

You might remember a blog I did last year about how not to update your look and feel. Essentially it says not to let old people (moi) design anything you want to appeal to the up and coming set of users.

I navigated to one of my old familiar sites and it has gone through a revamping by someone who certainly took my advice:


If you are over 60 (doh! moi again) give yourself about 5 minutes to figure out where to log in. Yes I know it says in BIG letters LOGIN and has a BIGASS button that says LOG IN.

It also has smudges for input fields.

I'm not complaining, "Brave new world that has such creatures in it" and all-- just saying I'm feeling like I'm a kazillion years old.

Maybe it needs a Help file that says "Type your password in the Password smudge." That would help geezers like me.

Tuesday, June 08, 2010

Mother and baby doing fine



I feel like I'm sending out a birth notice. This afternoon, Mike Auldridge inspected the latest batch of his MA-6 resophonic guitars (his signature guitar made by Paul Beard Guitars). I'm buying one directly through Mike. After checking them out (he still personally inspects all of his signature guitars) I'm told he said, "This one sounds just like mine," and then he set it aside for me.

Wow!

UPS says it will be here Thursday. Someone's not sleeping for the next couple of nights.

Put Personas to Work

Read my column this month in UXmatters: Personas as User Assistance and Navigation.

Wednesday, June 02, 2010

Requirements vs. Constraints

I love "x" graphs, you know, the ones that show one domain diminishing while another is increasing. They form an x, and the point of intersection represents a sweet spot or break-even point. These days, I feel like I'm living the one shown below:

The more feature-rich a particular design approach is, the more it delights product management. Of course, that starts to overload available engineering resources, which drives their delight down. Being a UX designer puts one in this position a lot. On the one hand, you want the product or service to be a differentiator in the marketplace, one that carries a lot of delight to the customer. On the other hand, it has to be buildable within the constraints of the organization's resources.

So you look for that acceptable area of compromise, somewhere close to the intersection of the two lines. Something achievable that represents enough delight to be a package you can take to market. You end up playing devil's advocate at times, pushing back on product management and goading engineering to stretch. You need to be sensitive to when to back off and say, "OK, I hear you, let me see how I can make the design accommodate that."

In the end, you have to have both sides at the table at the same time, otherwise you find yourself in a series of no-win situations where you are the bearer of the bad news (the areas shown in gray). It also helps the spirit of compromise if each side can be connected to the other's point of pain. Engineering is more willing to bend when they deal with Product Management directly, and Product Management is more willing to compromise when Engineering says "Our schema can't accommodate that kind of a query." Also, each side can hammer out alternatives a lot more efficiently when talking to one another. I'm always impressed how creative engineers can be if you share the problem with them instead of insisting on a particular solution.

This could be one of the most important skill sets a UX designer develops: serving as the arbiter of user requirements and product constraints.

Friday, May 28, 2010

Thanks to TTU

I spent last Thursday immersed with the students and faculty of Texas Tech's online Technical Communication and Rhetoric (TCR) PhD program. Actually, it all started on Wednesday evening with a delightful dinner at Dr. Tommy Barker's home. Tommy is head of STC's Academic SIG and Director of Technical Communication at TTU. Texas-style, nothing was done small or half way. Tommy had even procured a Dobro for me to use so we could do some bluegrass/rock-a-billy picking after dinner. Tommy played a mean acoustic guitar and fellow faculty member Ken Baake joined us on banjo. For those in need of a scary thought to haunt you through the day, I have two words: PhDs yodeling.

Thursday morning I listened to doctoral students talk about their research projects, and I gave a keynote talk during lunch on the role of PhDs as practitioners. I spent the afternoon with Dr. Joyce Locke Carter, the Director of Graduate Studies in TCR, sitting in her usability class and touring their usability and multimedia facilities. That evening the students invited me to a barbecue.

I feel like I have seen the future of technical communication, and we are in good hands. The students were engaged in exciting research projects and projected more energy than I have encountered in a long time. The quality of the program is impressive, from the faculty--which reads like a list of academic Who's Who in technical communication--to the caliber of the graduate students (the TCR program accepts only 20% of its applicants).

The students work online most of the year, but spend two weeks working through an intensive "boot-camp" style program every summer. What I find most impressive about the program is the community of scholars it is developing for our field. TTU has worked out an effective formula for combining distance learning with face-to-face networking. And because these students have become accustomed to collaborating online, they will stay connected and influencing each other for the rest of their careers.

Congratulations to TTU for an excellent program, and thanks for the hospitality. That, and a hearty i-e-o-d-lady-hoo.

Thursday, May 20, 2010

Robert's Rules of Order: Essential for UX?

I'm always amazed when information acquired in one context emerges to be useful in an entirely different environment. As a member of the STC Board of Directors, and its current president, I've had to learn a lot about Robert's Rules of Order. I even have my own dog-eared version that I referred to a lot during some tricky proceedings this past year. One would think, what could be more esoteric and useless in the real world of user experience design than parliamentary law? It's not like we apply that kind of formality in our Agile scrum meetings every morning.

"I move we develop a regex to capture the time stamp field."
"I second that."
"Discussion?"

So I'm working these days on trying to design a report format around a particular data security standard. I've spent a lot of time trying to understand the standard and what it requires of users and what it would require of our product. I suddenly realized that my analysis was feeling like the kind of research I did on Robert's Rules. I don't think I could have critically analyzed the standard nearly as effectively had I not had the experience of trying to critically understand Robert's Rules so I could use them effectively to move my agenda forward.

I'm used to getting a lot of value from my involvement as a volunteer with STC; even so, I was pleasantly surprised to see that the legal-like research I had done on parliamentary law paid off in developing skills later useful for researching a data security standard for a technical communication project. It's taught me to be more mindful of what I can take while in the act of giving. And hey, there's nothing wrong with that. The more we let ourselves benefit from volunteering, the more willing we are to volunteer.

Mantra for today: Do good; get smarter.

Monday, May 17, 2010

Happy Monday

Thanks to [unnamed product]'s registration site for giving me the ability to leap through the torn fabric of time and space.


Either that, or their UI developer slept during the lecture on Boolean logic. Or they think I'm quite large.

Also, check out my new motivational poster at http://cheezburger.com/View/3534870016

Thursday, May 13, 2010

Recalculating

We have a Scrum Master (someone who manages an Agile team) who has nerves of steel. No matter what goes wrong, he just recalculates the new path to the solution from the new location.

My hard drive crashed last week in Houston and the Gibson Original Acoustic Instruments factory in Nashville got flooded. They were to ship my new Dobro guitar to me in two weeks.

So there I am in the Sheraton business center trying to recalculate my life. I had once lusted for the Mike Auldridge signature guitar from master luthier Paul Beard's shop. Too pricey at the time. But maybe the Nashville flood was an act of God--literally. After all, I first got interested in Dobro almost 40 years ago when I heard Mike on the radio. I bought his album, Mike Auldridge--Dobro, and I still have it. My wife reminded me the other day that I've had that album longer than I have known her.

So I went to Mike's web site and it listed a number to call to order directly. Well, by direct they meant Mike Auldridge himself answers the phone. I was dumbstruck and stuttered and stumbled through something like "OMG, you're my hero, etc." Happy ending: I ordered the new guitar from Mike, who will pick it out himself from the batch that will be done the last week in June. (Sweetheart of a wife is financing the difference between this one and the other.)



So that part's handled. Now, I just have to restore everything I lost on my hard drive. From here on out, I'm doing everything in the cloud. Kinda goes along with the whole act of God thing.

Monday, April 19, 2010

I have come down from the mountain

I've been very quiet the last several weeks as I have been wrestling with a long-standing inner conflict of mine. I have scoured my soul and I now know my mind. I am ready to make my public statement about...

[ominous fanfare in background]

ending a parenthetical statement with the smiley face icon.

Often, I want to soften a strong statement or add some context to a comment by putting a self-deprecating or explanatory comment with terminal smiley face in parentheses at the end of the sentence (as if my sentences aren't long enough :-))

See!!! See the problem? Is that Mike smiling inside parentheses or is it Mike with a double chin?

I sometimes solve it by using an em dash instead of the parentheses--in this case a double hyphen, but that sometimes offsets the contextualizing comment too much :-)

So I've decided to let the smiley face icon serve as both a smile and a closing parenthesis (thus saving ink in these oh must be so green days :-)

At any rate, problem solved for me.

BTW, went to the Atlanta STC Currents conference this weekend. Wow, I like my chapter.

Friday, April 02, 2010

Afternoon in the Garden of Good and Evil

This week I visited with a ladies' book club in Pine Mountain, Georgia, to discuss my novel, Iron Hoop. It was a classic Southern experience right out of "Midnight in the Garden..." The group was well-to-do retirees who lived in a very upscale neighborhood situated on Piedmont Lake. Well dressed, genteel, and with Southern accents that flowed as sweetly and lazily as praline.

Needless to say, they all liked the character Grandmother Tillman--all being grandmothers themselves--and envied the relationship she has in the book with her grandson. The general conclusion was that they needed to tote guns as she did so they would be remembered for being more than just someone who said, "wash your hands" all the time.

The book deals with racism, and we had some interesting discussions about that common aspect of our Southern heritage.

And as many people are, they were interested in the process of creative writing. "Where did you get your ideas?"

It was a marvelous afternoon, sipping wine, having a light dinner, and being the center of attention in a room of gracious Southern ladies.

It's good to be a writer.

Thursday, March 25, 2010

Analysis of a Diagram

Just because you like something you created, it doesn't mean:
  • It's any good
  • You have a big ego
But it can be useful to stop and ponder something you did that you particularly like--so that you can understand your own design priorities a bit better.

I recently created a diagram for an article in UXmatters that I liked:



The article was about the differences in the roles of User Interface (UI) developer and User Experience (UX) designer. I wanted a diagram that showed that each had distinct areas of expertise and that there were areas of overlap as well. Duh, Venn diagram, that's the easy part. My normal instinct would probably have been to abstract the areas and give them awful nominalizations. I decided to use concrete examples instead and to leave the abstraction to the reader. In the article I said:

The area that tends to fall under the exclusive domain of UI development includes the programming skills and knowledge. If you had a pin labeled Ruby on Rails, the UI development role would be a good place to stick it. The area that tends to be the exclusive domain of User Experience relates to user research and usability testing. Thus, if you had a pin labeled card sorting, the UX side of the diagram would be its predictable home. The area of shared expertise between the two roles includes knowledge of UI patterns and standards—the widgets and elements that make up a user interface—as well as knowledge about the software development process.
I like the simplicity of the diagram and for some reason, I especially like the stick pins. I'm reminded of a story about the famous educator John Dewey. He was visiting a classroom once as a superintendent, and the teacher asked the class, "What is the center of the earth composed of?" The students eagerly raised their hands and the teacher called on one. "Igneous rock," came back the answer. Dewey then interrupted and asked, "If I could reach my hand all the way to the center of the earth, what would happen to it?" No one could answer.

My "If you could put a stick pin labeled..." approach seems to have the level of concrete understanding that Dewey was looking for. I like that a complex classification has been explained in terms of a physically familiar task such as putting stickpins on a board.

Doesn't mean it's good.
Doesn't mean I'm being egotistical to say I like it :-)

Thursday, March 18, 2010

Regression Testing for Usability

I just had a bad user experience at my bank's ATM. I'm not blogging to whine, they're a good bank, but I want to understand what went wrong with the experience and more importantly, the design process that led to it.

I drive up to the ATM and insert my card. New screen, bright and shiny oooooooh.

First new thing, it tells me to cover the keypad as I enter my PIN. Hmmmm, not sure how one does that while sitting in a car, so I pointed to the sky and yelled, "Hey look, it's the Goodyear blimp" hoping to distract any lurker who might be there to steal my PIN.

OK, I got my PIN entered and the next screen asked me what I wanted, and I pressed "Get cash."

Then something happened that's never happened before, it gave me a list of accounts to select from. One said "Savings-123456," another "Visa Platinum-7654321," and then one said "CRWN-987654."

Nothing said "Checking." I figured the first one was my savings account and thought the second one was probably my credit card. That left me concluding that CRWN-987654 was my checking. We're talking money here and all of a sudden my ATM is giving me practice questions for the SAT. If all men eat turnips and John is a man, does John eat turnips?

So to be on the safe side, I decide to check the numbers on my card to see if they match 987654. Oops, card is in ATM. Cancel transaction to read numbers off card. Oops, numbers no longer displayed on screen because I canceled transaction.

I'm in the business so I know what happened here. Product management decided to make my membership more valuable by now allowing me to select from multiple accounts when I withdraw cash. That's a good thing. But in doing so, the product has disrupted my familiar experience--turning a satisfier into a dis-satisfier.

When I was at CheckFree, whenever we introduced an enhancement to our online bill pay, we did what I called Usability Regression Testing. In QA, regression testing is when you make sure that a new feature doesn't break existing functionality. I think you have to do the same thing with usability, make sure that new features do not disrupt the comfort and familiarity of the user's current user experience.
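In code terms, the QA version of regression testing looks something like the minimal sketch below (Python, with hypothetical function names invented for illustration): the old single-account behavior keeps its test even after the new multi-account feature ships, and both tests must pass.

```python
# Minimal regression-testing sketch with hypothetical names.

def withdraw(balance, amount):
    """Original feature: withdraw from a single account."""
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

def withdraw_from(accounts, name, amount):
    """New feature: choose which account to withdraw from."""
    accounts = dict(accounts)  # don't mutate the caller's dict
    accounts[name] = withdraw(accounts[name], amount)
    return accounts

# Regression test: the pre-existing behavior must still hold.
def test_single_withdraw_still_works():
    assert withdraw(100, 40) == 60

# New test covering the new feature.
def test_multi_account_withdraw():
    result = withdraw_from({"checking": 100, "savings": 50}, "checking", 40)
    assert result == {"checking": 60, "savings": 50}

test_single_withdraw_still_works()
test_multi_account_withdraw()
print("all regression tests pass")
```

The usability analogue is the same discipline: before shipping the new account-picker screen, re-run the old "withdraw cash in under a minute" scenario with real users and make sure it still passes.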

Had they tested it they would have seen the whole number on the card in the ATM vs number on the screen thing.

They would have also figured out that CRWN did not mean "checking" to me. By the way, I checked it out with my wife, and apparently that's the marketing name for our service, "Crown Checking." Marketing people are bad about that; they assume we are all in love with their product names and therefore familiar with them. Also, it was a huge screen with only three accounts; they could have said "Crown checking account" and I would have been OK.

Designers beware. When we improve the feature set, we run the risk of breaking a comfortable user experience. A little regression testing is always good.

Tuesday, March 16, 2010

Incremental-decremental (excremental)

I just changed cable providers, and this one has the same UI problem as the last. In both cases, if you view the guide (program menu), the channels start at the top of the screen and are listed in incremental order. That is:
001
002
003
...
So if you want to see the next channel after you reach the bottom of the screen, you press the down arrow key. This means you are pressing a DOWN command to go UP in number. For example, to go from seeing channel 7 to channel 8, I press DOWN. No problem, really, because the screen scrolls in the direction I indicate.

But if I am actually watching channel 7 and I want to go to channel 8, I press UP. Of course, I habitually press the DOWN button because my frame of reference is the menu screen.

There is a simple solution: They could list programs on the menu screen starting with the top channels and then decrementing. That way DOWN means DOWN no matter what.

But then you start with the specialty and premium channels on the menu and not the common choices.

Excrement!

The point is that UI design is snarly stuff and not only must you accommodate user models and technical limitations, but sometimes business rules and market objectives as well.

Thursday, March 04, 2010

Squirrel!



Dug, you gotta love him! But I seem to get involved in discussion threads where it seems a cyber squirrel runs through the conversation and everyone gets distracted. It's a good thing we didn't have discussion groups in the old days.

Churchill: Things are looking rough on the continent, the axis forces are massing to eliminate free civilization as we know it. This could be our darkest hour.
Roosevelt: Wow, speaking of dark, the power went out last night and Eleanor and I had to scramble for candles.
DeGaulle: I hate that, you never know where you put them.
Churchill: But the Nazis and the Fascists!
Roosevelt: Let them get their own candles.
DeGaulle: We can't provide candles for the whole world.

Wednesday, March 03, 2010

Bleeding Edge

Neil Perlin is looking for participants for the "Beyond the Leading Edge" presentation at the STC Summit in May. Where I stand (click to enlarge):


Friday, February 19, 2010

Curling

The winter Olympics are here and so again is curling. I polish my miniature curling stone paperweight, put on my team USA curling sweatshirt, and I'm in my quadrennial state of euphoria.

I discovered it about 10 years ago in Canada watching the women's national final on TV in a pub in Victoria. The teams wore plaid skirts as part of their uniforms. [...] Sorry, I was off in my happy place there for a moment. I'm back.

I worked for a guy who once made an interesting metaphor using curling. We were doing a workshop on employee empowerment and someone asked him what he would do if an employee made a big mistake. His answer was, "I saw a sport called curling last week. Someone slides a stone along the ice, and the other team members go alongside with brooms, trying to influence its direction as it makes its way toward its goal. It's unlike bowling, where you throw the ball and then stand back and watch. If someone makes a big mistake bowling, I'll fire them. If they make a big mistake curling, well, we'll learn our lessons and move on."

The other thing I like about curling is it looks like a sport you could play with a beer close by.

Thursday, February 18, 2010

What makes experts expert

In my previous post I threw some (well-padded) elbows at experts, and Isaac Rabinovitch rightfully took me to task a bit. He made the points that sometimes the gnarly explanation is needed--true, see my post on G2G (Geek-to-Geek) communication--and that technical communicators should not cut off experts during their explanations.

My comments were directed as much at us: when we acquire expertise, we need to be mindful of how much our listeners (readers, friends, and family all included) need to know. Isaac's comments do raise the important issues of how we should interview SMEs and what our role is to our readers as surrogate SMEs.

Components of expertise


OK, let's get the obvious out of the way: knowledge. Java experts know a lot of Java syntax and such. Historians know a lot of events and dates. That's the easy part.

Studies of experts have discovered that experts see patterns that non-experts do not. For example, athletes talk about being able to "see" the court or the field. Part of why Peyton Manning can call such effective audibles is that he can see the patterns in the defense (whereas you or I would see 11 people). When I taught electronics, I liked to start the week by showing a typical schematic we would be dealing with that week and asking the students to estimate how many components there were. The answer was typically "hundreds." At the end of the week I'd ask the question again, and the answers were more realistic (20-30). What changed? The students now saw the schematic as a power supply, a pre-amp stage, and an amp stage. They saw the patterns, and that helped them process the previously overwhelming details.

Another thing research has shown about experts is that their knowledge is tacit--they no longer know what they know. They draw on their knowledge so instinctively they cannot observe their processes. I tried to document a couple of my Dobro picking patterns for a friend and it was HARD! Not the transcription and notation part, but just being able to slow down and see what I did instinctively.

So part of our job as communicators is to help SMEs uncover their tacit patterns so we can pass those along to our readers. In that way, we start to transfer expertise instead of just information.

I remember once interviewing an expert and asking what a good starting value was for a particular variable. "It doesn't matter" was all he would say. So I finally said, "OK, let's start with a million." In about five seconds we arrived at 35 as a good starting value. After that, it was just "When would you make it bigger? When would you make it smaller? How would I know if it were too big or too small?"

And this isn't just about technical writing. We are all SMEs at something. I've started writing music down and it's forced me to investigate the tacit patterns I've been applying. It's made me a better player and will enable me to be a better teacher if I can ever get one of my grandkids to take up an interest in Hootie's hillbilly music :-)

Tuesday, February 16, 2010

Three Mistakes Experts Make

As user experience professionals, we deal with experts a lot in the form of Subject Matter Experts. And in doing so, we become experts. Plus we deal with experts and expertise in a dozen different forms in our routine lives every day, so it is good to stop and talk about the three big mistakes experts make.

Mistake one: The infield fly rule


Imagine there is a group of you from work at a baseball game--you know, one of those team-building outings. Someone in the group is a non-American, say a Swede over on special assignment. Along about the third inning he says, "Oh I get it, if the fielder catches the ball before it hits the ground, the batter is out. Is that right?" The correct answer is "Hey, you've got it, let me buy you a beer."

But there will be an expert in the group who will feel compelled to explain the infield fly rule. "Well, that is often the case, but if there are runners on first and second and the batter hits a fly to the infield, then the batter is automatically out regardless of whether the ball gets caught, and the runners do not advance. The reason, of course, is to discourage the infielder from deliberately dropping the ball and then having force outs on all the bases."

Not only is it a buzzkill for the Swede (and anyone else in earshot), it is completely unnecessary. Who needs to know the infield fly rule? The batter? No, because no batter would ever hit a fly to the infield on purpose. The fielders? No, because their play is irrelevant; the batter is out regardless of what they do. The runners? No, they will not be allowed to advance.

So the only person who needs to know the infield fly rule is the umpire, and he can explain it to everyone on the occasional blue moon when it happens.

In many software applications, the computer is the umpire, and as long as it knows what it has to do, don't load down the user with overly technical explanations. Experts want their explanations to be complete and accurate, but user explanations just have to be viable. Agile development has the principle of JBGE--"Just Barely Good Enough." An explanation just has to be complete enough and accurate enough to get the user to the desired end goal. In our example, the Swede's understanding of baseball was adequate to enjoy and understand the game he was watching.

Mistake two: Too many explanations


If there is more than one way to do something, the expert will explain them all. At most we typically only need two ways to do something: (1) the easy way to remember and (2) the expert shortcut. Think about getting directions. If you are not comfortable with a particular section of town, would you prefer directions that had only two turns and took you 2.5 miles or one that had seven turns but only took 1.9 miles? Probably the longer but easier. After a while, though, you would probably like to know the shorter way.

Mistake three: The most difficult explanation


My guitar teacher showed me a way to derive all of the naturally occurring chords in a scale on the Dobro (for the key of G). The really useful ones are G, Am, Bm, C, D, and Em. The last one is an F# minor with a flatted 5th. This would be like having a group of guys named Al, Bill, Tom, Dave, Fred, and Throckmorton (how did HE get in?). For months I have tried to figure out why this anomaly occurs, this seemingly out-of-place chord. Today, driving in to work, I realized it is a D7, a great transitional chord and one that makes sense showing up with the others. (Like finding out Throckmorton is a nickname his great aunt gave him; his real name is Ted.)

But the instructor was right: it could also be called an F# minor with a flatted 5th (or an F#dim, for that matter). He let himself get distracted, I think, by the sequence of the notes and did not see it in the simpler context.
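
For the curious, the chord math checks out. Here is a small sketch (my own illustration, not from the lesson) that stacks thirds on each degree of the G major scale and confirms that the odd chord on the 7th degree--F#, A, C--is just a D7 with its root missing:

```python
# Build the diatonic triads of G major by stacking scale thirds, then check
# the observation above: the triad on the 7th degree is a rootless D7.

G_MAJOR = ["G", "A", "B", "C", "D", "E", "F#"]

def triad(scale, degree):
    """Stack two thirds (every other scale note) on a 0-based scale degree."""
    return [scale[(degree + step) % len(scale)] for step in (0, 2, 4)]

triads = {i + 1: triad(G_MAJOR, i) for i in range(7)}

# Degree 7 gives F#, A, C -- the F# diminished (F# minor flat-5) triad.
assert triads[7] == ["F#", "A", "C"]

# D7 = D, F#, A, C: the degree-7 triad is exactly D7 minus its root.
d7 = ["D", "F#", "A", "C"]
assert set(triads[7]) < set(d7)
```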

Conclusion


  1. Look for viable explanations.
  2. Limit the number of alternatives to one easy one and one expert shortcut.
  3. Give the simplest explanation. (See Occam's Razor.)



Postscript: In verifying my link, I could not help but catch this irony (click to enlarge):

Friday, February 12, 2010

Smacksonomies

In my book Iron Hoop I portray a conversation between the local junk man and a crony about how the junk man arranges things in the junk yard. It is a very thinly disguised metaphor about the inherent problems I have with taxonomies.


The one I have a continual problem with and never get around to fixing is my folder arrangement on my computer. I have a folder called Presentations in which I file my PowerPoint presentations. Within that I have some subfolders for specific conferences or organizations.


But I also have peer level folders for those same organizations and conferences to collect documents and correspondence.

You guessed it; I'm inconsistent with where I store PowerPoint presentations and always have to scratch my head and wonder if it is in Presentations\STC or STC\Presentations.


Essentially I'm conflicted between Object\Audience and Audience\Object. I wonder if there is a natural taxonomy that would guide me, some world view that could serve as a model for these kinds of decisions. I've wondered about the Carnegie Mellon food|shelter|handle research, but that's not getting through.

Thoughts?

BTW, I used to have a similar dilemma with what goes in the columns and what goes in the rows in designing tables. I think I solved that. See my UXmatters column.

Thursday, February 11, 2010

Content over form

What is it about the striking of the Submit button or the opening of the first manual off the press that makes me an expert proofreader?

The first sentence of an article proposal I recently sent to an editor: "I have contribute in the past to the ..." Subject-verb disagreement in the FIRST THREE WORDS!

The happy ending is that they accepted the proposal anyway. Sometimes the message outshouts the grammar--ain't that the truth!

Tuesday, February 09, 2010

Nouns in 3D

Here's an interesting snippet of research coming out of Carnegie Mellon: How the brain arranges nouns.

Using functional magnetic resonance imaging (fMRI) technology, members of the Center for Cognitive Brain Imaging have gained deep insight into the way human brains categorize objects. In a breakthrough that demonstrates the interdepartmental cooperation here at Carnegie Mellon, neuroscientists Marcel Just and Vladimir Cherkassky and computer scientists Tom Mitchell and Sandesh Aryal have arrived at results that bode well for human-computer interfaces and neuropsychiatry.

Their research has concluded that humans represent all non-human objects in terms of three classes or dimensions. Just defines these dimensions as having to do with eating, shelter, and the way the object is used. He explained that when one sees an object, the brain thinks, “Can I eat it? How do I hold it? Can it give me shelter?” Indeed, all concrete objects are represented in terms of these three dimensions, much in the way that all places in space are represented by the three dimensions that we experience every day.



OK, sounds a little "out there" at first. But think about one of the most ubiquitous of icons and navigational constructs on the web: "Home." Does that say "Gimme Shelter" or what?

Thinking about it a little further, we consume (eat) data, we store (shelter) it, and we move it (hold it) around. I know, the jokes come too quickly. Consider the following alternatives to the traditional Save and Cancel buttons:



Still, I think the research findings are provocative and deserve some serious consideration for UI applications.

Wednesday, February 03, 2010

Hump-day Humor 2010-5

On-the-job negotiations.

Click comic to enlarge it.

Thursday, January 28, 2010

New award, and a cry for help!



My NSS award goes collectively to all of the accessibility web sites on colorblindness that advised me to offer an alternative to using color to convey meaning.

An open question for my readers:
If an IP address in red means one thing and an IP address in blue means something else, what alternative approach would you recommend for this kind of scenario?

Wednesday, January 27, 2010

Hump-day Humor 2010-4

Don't you just love working on distributed teams?

Click cartoon to enlarge it.

Monday, January 25, 2010

I have so been here!

I just had to chuckle when I read Dilbert yesterday. I think a lot of UX departments go through this phase. I hate hiring onto a job and then showing up for this as my first meeting (it's happened more than once).
Dilbert.com

Friday, January 22, 2010

Hump-day Humor 2010-3

Got so busy this week doing real work I totally forgot my Hump Day cartoon.



Click image to enlarge it.

Wednesday, January 20, 2010

Cause and Effect: A Two-way Loop

Had coffee this morning with my friend Ken Hilburn from Juice Analytics. Essentially, they are in the business of helping customers visualize data. Naturally, we talked about dashboards, a big emphasis of his company. The conversation helped close a loop for me between data dashboards and decision-support embedded user assistance.



Any follower of this blog or my column in UXmatters knows that I am a big proponent that user assistance needs to focus more on domain expertise rather than on how to interact mechanically with the user interface. For example, see my article User Assistance in the Role of Domain Expert.

I've always looked at it primarily from the viewpoint of decision support when asking a user to enter a parameter or make a selection on the UI. Essentially the pattern I support is:
  1. Define what the parameter controls.
  2. Advise what a good starting value is.
  3. Advise when the user might want to make it higher or lower (including any tradeoffs).
  4. Advise what the user could monitor to assess the impact of their decision, e.g., specific reports or status screens.
What came out of the coffee klatch discussion with Ken was to link the parameter setting directly to that part of the application where its impact would be evident--and vice versa. For example, if lowering a heartbeat threshold parameter could impact server bandwidth, provide a link to the dashboard or system health screen that monitors bandwidth utilization. That way the user sees the baseline value and knows where to monitor for any negative impacts.
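
The four-step pattern above could even be captured as a simple embedded-help record. This is a hypothetical sketch; the class, field names, and heartbeat example values are my own invention, not from any real product:

```python
# A hypothetical data structure for decision-support user assistance,
# mirroring the four steps: define, starting value, tuning advice, monitoring.
from dataclasses import dataclass

@dataclass
class ParameterHelp:
    name: str
    controls: str         # 1. what the parameter controls
    starting_value: str   # 2. a good starting value
    tuning_advice: str    # 3. when to make it higher or lower (and tradeoffs)
    monitor: str          # 4. where to watch the impact of the decision

heartbeat = ParameterHelp(
    name="heartbeat_threshold",
    controls="How often agents report liveness to the server.",
    starting_value="30 seconds",
    tuning_advice="Lower it for faster failure detection; note that "
                  "lower values increase network chatter.",
    monitor="Bandwidth utilization dashboard",
)
```

Linking the `monitor` field to the actual dashboard screen (and the dashboard back to this parameter) is the loop the post describes.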

On the reverse side, if you have dashboard readouts that are affected by decisions the user makes in an application, provide that information at the dashboard and provide links back to where those decisions can be modified. For example, the bandwidth utilization dashboard should inform the user what kinds of things can make utilization go up (such as lower heartbeat thresholds) and link back to where those parameters can be adjusted. That way, information is linked to action.

And if you're not going to act on information, what's its value?

Tuesday, January 19, 2010

Embedding User Experience in the Product Life Cycle

I have a new column out today in UXmatters.

All UX professionals, not just user assistance developers, face the problem of integrating their work into the product development lifecycle. At lower levels of organizational usability maturity, too often, the contributions of User Experience tend to be reactive. Usability professionals test the usability of a given product, then designers mitigate any shortcomings they find, and user assistance developers merely document what is already there. This column takes a look at the full scope of the product development lifecycle and how UX professionals can add value. [go to article]

Friday, January 15, 2010

Usability Risk

Granted, I'm still in the honeymoon phase on my new team, but what a honeymoon! This week we had some great meetings with key players to define the role of the UX Architects and add that to the existing team roles. There's a double gasp here. First, this team already has a well-defined Scrum process that identifies all of the roles and documents their activities before, during, and after a project and a Sprint. The second gasp is that they are including the UX Architect role in that documentation to ensure that we all understand how we support the development of new services.

As we discussed the different things we could do and could produce, we noted that not every feature (or story) would need the full rigor of every UX activity and every UX artifact. So one of our important activities at the beginning of each project and each Sprint is to determine the level of usability risk each feature or story could have and to plan the appropriate level of UX involvement.

Usability risk is a concept that I first learned while working for my Usability Hero/Mentor Loren Burke. Loren had started his usability career at IBM as a project manager in charge of saving programs that had landed in the ditch. He developed a keen sense of the need to identify user acceptance issues that could kill a product or web app and then focus on those items. Shout out to Loren!

The other UX Architect and I now need to codify what criteria we will use to assess usability risk for our kinds of products. My own brainstorming would start with these considerations:
  • Is there a UI? (No UI equates to low risk.)
  • How complex is the UI? (Greater complexity means greater usability risk.)
  • How much interactivity is in the UI? (The more ways there are to interact, the greater the risk.)
  • How tied in is the UI to a critical business driver? (Don't spend resources on features that represent minor impact on the business success of the product or service.)
  • How new are the interactions and content in the UI to the development team? (New stuff carries risk.)
  • How new are the interactions and content in the UI to the user? (New tricks for old dogs are risky.)
  • Where in the user task flow would the UI occur? (Earlier equates to higher risk--users are more judgmental during first impressions.) Loren's greatest contribution to usability was creating the concept of the "judgment window": the early user experience with a product or service that flavors the user's ongoing acceptance of it.
What other risk criteria do you use? What would you add to this list?
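
For what it's worth, criteria like these could be rolled into a rough triage score during Sprint planning. The questions paraphrase the list above, but the ratings, weights, and thresholds below are invented purely for illustration:

```python
# A hypothetical usability-risk triage sketch. Each question is rated
# 0 (low) to 2 (high); the cutoffs are illustrative, not a real method.

RISK_QUESTIONS = [
    "Is there a UI at all?",
    "How complex is the UI?",
    "How much interactivity is in the UI?",
    "How tied is the UI to a critical business driver?",
    "How new are the interactions to the development team?",
    "How new are the interactions to the user?",
    "How early in the user task flow does the UI occur?",
]

def usability_risk(answers):
    """answers: one 0-2 rating per question, in RISK_QUESTIONS order."""
    if answers[0] == 0:          # no UI equates to low risk
        return "low"
    score = sum(answers)
    if score >= 8:
        return "high"
    return "medium" if score >= 4 else "low"

print(usability_risk([0, 0, 0, 0, 0, 0, 0]))  # no UI -> low
print(usability_risk([2, 2, 2, 1, 1, 1, 2]))  # score 11 -> high
```

The point is not the arithmetic but agreeing up front, per feature or story, how much UX rigor is warranted.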

Thursday, January 14, 2010

The power of an example

I had a really stupid user experience yesterday, so I am trying not to squander my ignorance and understand what went wrong. More importantly, would a difference in the design have helped?

So I'm having to set up a two-layer set of authentication credentials for myself. I have a keychain fob that electronically displays six digits, and the digits change every 30 seconds (these digits are in sync with the network I need to log onto). So I set up a PIN for myself, and then when I log in, I use both my PIN and the currently displayed digits from my fob. If someone figures out my PIN, they still need my fob to get the digits du jour (or digits du demi-minute, in this case) to log in. If I lose my fob, whoever finds it still needs my PIN to get into the network.

OK, for starters, that bit of conceptual knowledge would have been useful up front. I would have understood what I was setting up and how it needed to work when done.

I got an email with a set of instructions. There were liberal screen shots, which initially made me feel good (even though I didn't want them to, because I'd like to believe that users don't need screen shots). As it turned out, the shots didn't help and even hurt in a way.

So the process had me create a PIN (4-8 alpha-numeric characters) and then asked me to log in on the screen shown below (click on image to enlarge it):


Up to this point, the instructions had talked about a CODE (the six digits on the fob) and a password (my original system password to get into this set-up-an-authentication task), but had never talked about a PASSCODE. The written instructions had a screen shot (just like the diagram above, but an actual screen capture) and said: Wait for the Next Code on your Token, then enter your PIN+CODE.

When I went back after finally being successful, I realized exactly what that meant and I wondered why I had had so much trouble. Here are the ways I got confused:
  • The screen wanted two pieces of information but I had only one field.
  • What does "response" mean?
  • The screen capture with the asterisks didn't give me any clue as to what should be in there.
  • What I call a fob they were calling a card.
  • What the hell is a PASSCODE?

So I entered my new PIN and pressed Enter. Nope.
Entered my system password. Nope.
I experimented with CODE then PIN (because in most applications I use with a PIN, the PIN is the last thing I enter). Nope.
Finally, I figured out that I had to type my PIN followed by the CODE on the fob and then click Continue. Well, that's exactly what the printed instructions said (after you wrangle with whether or not to type the + sign).

Lessons learned:
  • It's hard to bounce back and forth between screen shots of instructions and the actual screens.
  • An upfront conceptual paragraph that described how two-level authentication worked would have helped.
  • And how about this for the screen design (click on image to enlarge):


Since the code is always six digits and the PIN can be alpha-numeric, I think it's pretty clear what is being asked for once you see the example.
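
As a thought experiment, the combined-field rule (a 4-8 character alphanumeric PIN immediately followed by the six-digit code, with no + sign actually typed) is simple enough to validate mechanically. A hypothetical sketch:

```python
# A hypothetical format check for the PIN+CODE passcode described above:
# 4-8 alphanumeric characters (the PIN) followed by exactly 6 digits
# (the fob code), entered as one string with no '+' sign.
import re

PASSCODE_RE = re.compile(r"^[A-Za-z0-9]{4,8}\d{6}$")

def looks_like_passcode(entry):
    """Return True if the entry matches the PIN+CODE shape."""
    return bool(PASSCODE_RE.match(entry))

print(looks_like_passcode("myPin123456"))   # PIN "myPin" + code 123456 -> True
print(looks_like_passcode("myPin+123456"))  # the '+' is not typed -> False
```

A front-end check like this could have caught my wrong guesses immediately and hinted at the expected format, instead of letting the server reject them one by one.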