November 17th, 2005 — 12:00am
At least according to the Boston Globe article titled Don’t underestimate the value of social skills, in which Penelope Trunk quotes an HBS faculty member as follows:
“In fact, across the board, in a wide variety of businesses, people would rather work with someone who is likable and incompetent than with someone who is skilled and obnoxious, said Tiziana Casciaro, a professor at Harvard Business School. ‘How we value competence changes depending on whether we like someone or not,’ she says.”
I guess this explains how we ended up with George W. Bush as President…
Comment » | The Working Life
April 2nd, 2005 — 12:00am
David Brooks’ Op-Ed column The Art of Intelligence in today’s NY Times is strongly relevant to questions of user research method, design philosophy, and understanding user experiences.
Brooks opens by asserting that the US Intelligence community shifted away from qualitative / interpretive research and analysis methods to quantitative research and analysis methods during the 1960s, in an attempt to legitimize conclusions in the fashion of the physical sciences. From this beginning, Brooks concludes that this basic epistemological shift in thinking about what sorts of information are relevant to understanding the needs and views of groups of people (nations, societies, political leadership circles) yielded interpretations of their views and plans which were either useless or incorrect, models which then led decision makers to a series of dramatic policy errors, examples of which we still see to this day.
Brooks contrasts the “unimaginative” quantitative interpretations assembled by statistical specialists with the broad mix of sources and perspectives which cultural and social thinkers in the 1950s used to understand American and other societies in narrative, qualitative ways.
According to Brooks, narrative, novelistic ways of understanding provided much better advice (more insightful, imaginative, accurate, and useful) on how Americans and others understood the world, opening the way to insight into strategic trends and opportunities. I’ve read many of the books he uses as examples; they’re some of the classics of the qualitative tradition on social / cultural / historical reading lists, and I’ve taken away vivid pictures of the times and places they describe that I use to this day when called on to provide perspective on those environments.
Perhaps it’s implied, but what Brooks doesn’t mention is the obvious point that both approaches, qualitative and quantitative, are necessary for crafting fully-dimensioned pictures of people. Moving explicitly to the context of user research, qualitative analysis can tell us what people want or need or think or feel, but numbers give specific answers regarding things like what they’re willing or able to spend, how much time they will invest in trying to find a piece of information, or how many interruptions they will tolerate before quitting a task in frustration.
When a designer must choose between interaction patterns, navigation labels, product imagery, or task flows, they need both types of understanding to make an informed decision.
Some excerpts from Brooks’ column:
“They relied on their knowledge of history, literature, philosophy and theology to recognize social patterns and grasp emerging trends.”
This sounds like a strong synthetic approach to user research.
“I’ll believe the system has been reformed when policy makers are presented with competing reports, signed by individual thinkers, and are no longer presented with anonymous, bureaucratically homogenized, bulleted points that pretend to be the product of scientific consensus.”
“But the problem is not bureaucratic. It’s epistemological. Individuals are good at using intuition and imagination to understand other humans. We know from recent advances in neuroscience, popularized in Malcolm Gladwell’s “Blink,” that the human mind can perform fantastically complicated feats of subconscious pattern recognition. There is a powerful backstage process we use to interpret the world and the people around us.”
“When you try to analyze human affairs using a process that is systematic, codified and bureaucratic, as the CIA does, you anesthetize all of these tools. You don’t produce reason – you produce what Irving Kristol called the elephantiasis of reason.”
Comment » | User Research
March 15th, 2005 — 12:00am
A recent article from ZDNet — Researchers: Metcalfe’s Law overshoots the mark — reports that two researchers at the University of Minnesota have released a preliminary study in which they conclude that Metcalfe’s Law significantly overestimates the rate at which the value of a network increases as its size increases. The study was published March 2 by Andrew Odlyzko and Benjamin Tilly of the university’s Digital Technology Center.
Here are a few snippets from the paper:
“The fundamental fallacy underlying Metcalfe’s (Law) is in the assumption that all connections or all groups are equally valuable.”
I’m always happy to find a declaration in support of quality as a differentiator. Of course, quality is a complex and subjective measurement, and so it is no surprise that Odlyzko and Tilly first recall it to relevance, and then continue to say, “The general conclusion is that accurate valuation of networks is complicated, and no simple rule will apply universally.”
It makes me happy when I see smart people saying complicated things are complicated. Odlyzko and Tilly are academics, and so it’s in their interest for most everyone else to believe the things they study are complicated, but I think that there’s less danger in this than in basing your business plan or your investment decisions on a fallacious assumption that a very clever entrepreneur transmogrified into an equation (which somehow, by exaggeration, became a ‘law’) in a moment of self-serving marketing genius. I know this from experience, because I’m guilty of both of these mistakes.
Moving on, as an example, Odlyzko and Tilly declare, “Zipf’s Law is behind phenomena such as ‘content is not king’ [21], and ‘long tails’ [1], which argue that it is the huge volumes of small items or interactions, not the few huge hits, that produce the most value. It even helps explain why both the public and decision makers so often are preoccupied with the ‘hits,’ since, especially when the total number of items available is relatively small, they can dominate. By Zipf’s Law, if value follows popularity, then the value of a collection of n items is proportional to log(n). If we have a billion items, then the most popular one thousand will contribute a third of the total value, the next million another third, and the remaining almost a billion the remaining third. But if we have online music stores such as Rhapsody or iTunes that carry 735,000 titles while the traditional brick-and-mortar record store carries 20,000 titles, then the additional value of the ‘long tails’ of the download services is only about 33% larger than that of record stores.” {citations available in the original report}
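To make the arithmetic in that passage concrete, here is a small Python sketch (my own illustration, not code from the paper) of the log(n) valuation rule Odlyzko and Tilly describe: if the k-th most popular item is worth roughly 1/k, the cumulative value of the top k items grows like log(k), so the top thousand, the next million, and the rest of a billion-item collection each contribute about a third, and a 735,000-title catalog comes out roughly a third more valuable than a 20,000-title one (the plain logarithm used here lands within a few points of the paper’s “about 33%” figure).

```python
import math

# Under Zipf's Law the k-th most popular item is worth ~1/k, so the
# cumulative value of the top k items of a large collection grows like log(k).
def zipf_value(k: int) -> float:
    """Approximate (unnormalized) value of the k most popular items."""
    return math.log(k)

# Billion-item example from the quoted passage: the top thousand, the next
# million, and the remaining ~billion items each contribute roughly a third.
total = zipf_value(10**9)
top_thousand = zipf_value(10**3) / total
next_million = (zipf_value(10**6) - zipf_value(10**3)) / total
the_rest = (total - zipf_value(10**6)) / total
print(f"top 1,000: {top_thousand:.2f}  next ~1M: {next_million:.2f}  rest: {the_rest:.2f}")

# Long-tail comparison: a 735,000-title download store vs. a 20,000-title
# brick-and-mortar store is roughly a third more valuable under this rule.
ratio = zipf_value(735_000) / zipf_value(20_000)
print(f"735,000 titles vs 20,000 titles: ~{ratio - 1:.0%} more value")
```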
This last raises the question of value, but of course that’s also a complex and subjective judgement…
And with this they’ve introduced context as another important criterion. Context of course can take many forms; they make most use of geographic locality, and then extend their analysis by looking at how common interest in content on the part of academics functions as another index of locality, saying, “Communication networks do not grow independently of social relations. When people are added, they induce those close to them to join. Therefore in a mature network, those who are most important to people already in the network are likely to also be members. So additional growth is likely to occur at the boundaries of what existing people care about.”
The references alone make this paper worth downloading and scanning. Read more of Odlyzko’s work.
Comment » | The Media Environment