Saturday, December 09, 2006

Sensemaking in the zeitgeist

This follows up on an earlier post that mentioned the conflict between principled vs intuitive approaches to design.

Much of my interest since the 1980s has been in the idea of sensemaking: what happens when individuals or groups, in the course of their work, hit something they don't understand or can't get past. Communication and other breakdowns often occur at such times; people don't know what to do, or they have conflicting or confused ideas about the right actions to take. I think I'm pretty good at coming up with structures, tools, and interventions to help at those moments, whether that's UI design, user guides, presentations, training materials, facilitation, hypermedia discourse, etc.

This is in the face of, and often in opposition to, situations where everyone else seems to know what needs to be done, how it needs to be done, what it means in the context, and how things should be prioritized, often with much greater clarity and nuance, and in more dimensions, than I do. At least at first blush, I often feel not very intelligent about the bulk of what is generally known. If I throw myself into the situation and get immersed in the details and how they relate to one another (often in the context of having to create one of the interventions I spoke of above), then I can get some of that understanding. I'm pretty good at looking deeply at something and getting its nuances in pursuit of some specific end, but not very good at just generally picking up the zeitgeist, slipping unproblematically into the shared pool of knowledge that others seem to absorb just by being around.

I've seen this very clearly at my job (in a large systems development organization). Most everyone else seems to better understand the technical stuff -- servers, security processes, testing protocols, configuration management, etc. -- and also the subject matter of the applications we develop, especially how all the systems and processes relate to one another. Even in an area I'm ostensibly more expert in, user interfaces, many seem to know more fluidly and quickly than I do what needs to be in an application's UI, and what the user experience should and shouldn't be. The kinds of interventions and comments I make often seem more superficial, "cosmetic", on the wrong or at least a not very important level (not as important as other levels and interventions that have to happen). Engineering, logistical, and operations considerations trump all else, and this is not to imply that this is wrong; these considerations are indeed often much more what matters, or at least what minimally needs to be done, and done right, for the application to get delivered on time and function in its environment at all. I feel that I ought to have more insight, due to my experience, level, skills, enlightened worldview, etc., but paradoxically I often seem to have less. Part of this is good, if humbling. I don't (or at least, constantly realize I shouldn't) inflate my own importance and wisdom. I listen more to the insights of others and recognize their skills and experience.

On the other hand, there are limits to what the zeitgeist, the common wisdom, the generally known, can produce. From an IT application development point of view, this often shows up in the level of quality and usability that a particular system attains. The zeitgeist seems programmed to deliver functionality that can be attained by a certain date, within constraints, and that 'works' in the existing environment; the rub lies in what it doesn't do, the things that don't add up, that are invisible or given less attention than they require, until the system hits the factory floor. There are usually neither the time nor the resources to give these considerations their due in the system development lifecycle. When they come up, they present dilemmas and conundrums -- akin to the sensemaking moments I was talking about above. They require a different level of thought and intervention, and the zeitgeist doesn't help much. The considerations are unique, not generic, and have deeply to do with the intricacies and subtleties of the relationships of the parts to the whole, especially as a user will encounter them. And they are usually not things that simply asking a user could have surfaced in advance -- they only come out when the new application exists, when it has tangible form that someone can interact with (though certainly, as user-centered design advocates would argue, sometimes the problems could have been avoided through upfront user research and low-fidelity prototypes; but only sometimes). It takes a ton of work to get to the point where you can even see the problems emerge. At that point, different skills are required. I'm happier and feel less stupid there. But those points are not the preponderance of what goes on.

1 comment:

Al said...

Since I wrote this, I have experienced a validation of much of what I tried to say above. At my job we've been involved in the launch of a pretty large web-based internal ordering system. The first few months of the effort were organized around the kind of functional/logistical/engineering considerations I described. But it became clear when users worked with the early versions that, although there was a ton of functionality in the product, people were getting lost and confused, rendering much of that functionality unusable. We were able to successfully redesign a lot of the surface appearance without having to re-engineer most of the underlying code, and the "different skills" are what made the difference. So possibly I'm not as lame and non-expert as the above might sound :-)