Monthly Archives: March 2013

Story technique has been with us for 2300 years – it’s time to brush up on it

2300 years ago Aristotle wrote “The Poetics”, which contained his secrets of storytelling. Human nature hasn’t changed, and neither have the secrets.

Many research presentations I’ve seen, including many of my own, have been bogged down by too many facts and figures. It is like reading a book so full of florid description that one begins to skip pages, looking for the action.

Likewise stories can suffer from relentless action – the type we see in Peter Jackson movies where we get chase, fight, chase, fight, another chase, another fight followed by another chase – and the net result is just plain boredom. His King Kong movie is one of the few films I’d rate as unwatchable. It ignored the storytelling craft. It was all pageantry (look at our CGI techniques!) and no drama.

We do the same with research. We go heavy on descriptive results – without telling the true story. Or we go heavy on special effects (I do this too much: showing off analytic techniques) but forget to tell the story. Or we simply have a story but we don’t tell it with any craft. We muddle it up. The drama is in there somewhere but we didn’t quite extract it.

This seems crazy, because the craft of telling stories – the techniques and skills required – has been part of our pantheon of written human knowledge for 23 centuries. Storytelling goes back to the dawn of civilisation, but from Aristotle onwards the Greeks began thinking about the craft, analysing it, and applying systematic rules to it.

Why not? As Steven Pinker explains it, storytelling has universal elements across so many cultures that he concludes stories are part of how our brains are wired. They reflect how we think. We’re engineered to tell stories. Stories are a means of processing complex visual, verbal and emotional information.

In the 20th century much was written about storytelling craft as writers considered modern-day psychology and found, among other things, how well Shakespeare captured the human condition. You could pick apart Othello and find it stood up to a Jungian framework, or to modern theories of the human condition. Writers such as Lajos Egri, who published the seminal guide for playwrights The Art of Dramatic Writing, helped create the debate about what drives a good story: is it events and action, or is it character? He concluded that character was at the heart.

So here is a good definition of what makes a good non-fiction story, summed up by the American Jon Franklin in his book Writing for Story.

“A story consists of a sequence of actions that occur when a sympathetic character encounters a complicating situation that he or she confronts or solves.”

Sounds simple, and – actually – it is. But the next layer down is where the story craft gets more complicated:

  1. Giving the story some structure. Do we start at the beginning and build to a conclusion? Or do we start at a critical moment of decision – and then go back and fill in the back-story and offer the options that our lead character faces?
  2. Choosing from whose point of view we tell the story. (Do we tell it from the brand’s point of view?  Or the customers’ point of view?)
  3. Characterisation. Do we paint the Brand as a hero? Or is it a flawed everyman? Are those customers a roiling mass – a Spartacus uprising in the making? – or are they the savvy, price-seeking satisficers who are undoing the good work of marketing? Who are the goodies?

These are just some of the decisions we must make when we tell stories, and they require a lot of forethought and imagination. The process is far different from the usual art of starting a PPT deck with Q1 and working through to the results of the final question. I once wrote a teen novel (The Whole of the Moon) which did quite well, but I spent a month deciding whether to tell it in the first person or the third.

Well before then, working in TV drama editing and writing, I learned just how important it is to find a congruency between action and character. The decisions made by the protagonist (he kills his attacker) need to be within the realm of possibility for that character. (Would Coca-Cola really do this??)

I also learned that good stories need some relief. Shakespeare would open each act with a couple of fools joking around: something to get the rowdy audience engaged before launching into a Lady Macbeth tirade. In client presentations or conferences I try the same, and the light moments may seem like diversions, but they always have a point – they put across the enjoyment we’ve had in the project, or they give a bit of anecdotal evidence of the dramas and challenges we faced in the survey: the day the blizzard held up the fieldwork. These diversions humanise the story, and connect the storyteller to the audience.

Writers can get into a groove and employ hundreds of these little lessons instinctively – but it is increasingly important that researchers and analysts now also learn some of these techniques. We have audiences who want to digest the main thrust of what the data is saying.

And as story tellers we don’t want them to walk out on us.

Even modern theatre employs the lessons from ancient Greece. After all, it is all about people and how they make decisions when complications get in the way of their objectives.

Why storytelling is going to be the requisite skill of the future.

With the rising tide of business data, our human capacity to interpret and understand all this information will need to shift gear.

Already in the last five years organisations have been shifting out of flat spreadsheet land, in which numbers are presented (pies and bars, or simple statistics) on platforms as basic as Excel. Ten years ago an executive could bark at their analyst: “Give me the numbers!” and really, they could still handle it – all the numbers.

But now that volumes have grown, the pies and bars and simple outputs are not enough – and one response from organisations is to dashboard the data (creating look-up systems to help wade through all the layers), or to ask researchers to do a better job of compiling the story from its multiple sources – customer feedback channels combined with social media streams and integrated with sales data. This is just about manageable at present – but with the availability of relevant data soaring by an estimated 45% per annum (McKinsey’s figure), current-day solutions are going to struggle within 18 months or so.
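That 18-month horizon follows directly from the growth rate. A quick compound-growth sketch – the 45% per annum rate is the McKinsey estimate quoted above; the rest is plain arithmetic:

```python
import math

# Growth rate for relevant business data, as estimated by McKinsey
# and quoted in the post. Everything below is simple compounding.
ANNUAL_GROWTH = 0.45

# How much does the data volume grow over the next 18 months (1.5 years)?
growth_18_months = (1 + ANNUAL_GROWTH) ** 1.5
print(f"Growth over 18 months: {growth_18_months:.2f}x")  # ~1.75x

# And how long until the volume doubles outright?
doubling_years = math.log(2) / math.log(1 + ANNUAL_GROWTH)
print(f"Doubling time: {doubling_years:.1f} years")  # under two years
```

In other words, whatever reporting pipeline is coping today will be asked to handle nearly twice the volume before two years are out.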

People are simply getting swamped.

But the answer is available, and it involves a paradigm shift from a numbers-reporting focus back to a storytelling focus. After all, numbers only ever represented the story to begin with. As I say – data is not about data, it’s about people – and always was. The sales figures? They tell the stories of thousands of customers who made individual decisions.

Storytelling, and the capacity to understand and recall stories, is a fabulous human capability that we develop from infancy. Through stories we learn about the complexities of our social moral codes, or about elements of human character that are to be enshrined. This is complicated stuff, yet easily interpreted once delivered within a clear story line. We are wired this way. Luckily. We can handle Shakespearian themes; we can understand great tragic turning points and the ins and outs of the complex human condition. We can do this at age 16 – we don’t need an MBA to understand a story.

Now storytelling is an art, and it involves a lot of skill and story-craft. It doesn’t need to be high literature to succeed (hey, we have John Grisham et al. to prove that simple techniques can entertain us), but increasingly it will be a requisite skill of the near future.

Analysts will need to know the difference between narrative (the King died, then the Queen died) and plot (the King died, then the Queen died, of grief). They will need to be stronger at picking out the information that explains why things happen, and stronger at asking questions that give us better, more powerful data about human motivations. (Demographics are not a strong basis for a story.) Most of all they’ll need empathy – a nose for a good story and the capacity to assemble the facts, interpret what’s going on (it is there somewhere amongst all those billions of lines of information), and get up in front of the CEO and be able to say:

“Chief, I want to tell you a story…I want to tell you a fable that reminds me of King Canute….”

In other words, to boil all that information down into a drama that can be processed on a human scale by a board of directors.

All our marketing activities. All our business challenges. All those facts and figures about a changing society.  They are not about numbers. They’re about people and about the stories of those people.

How equipped are we to understand these, en masse, and to tell these tales in a form that enables our employers to understand, amongst the blizzard of numbers, that their company is seen, basically, as the Grinch that Stole Christmas?

What Hugo Chavez taught me about decision-making.

Hugo Chavez and my uncle made a massive decision through their joint belief in clarity.

I never met the late Hugo Chavez, but I am one degree of separation from Venezuela’s revered leader thanks to his dealings with my uncle, Rod Stuart, who was based in Montreal. And Rod had a very instructive story for me as a researcher, from a time when I had tried to impress him with the grunty statistical work I could do.

I thought I’d be impressing my uncle, a civil engineer and the man in charge – the person at the very top – of several hydro projects world-wide. He was responsible for, or consultant to, (I think) six of the largest hydro projects on our planet, including the Three Gorges project, the massive Canadian Churchill Falls project, a huge Pakistani dam built in the 1960s, as well as the top-10-ranked Simon Bolivar hydro project in Venezuela.

It was on this project that Uncle Rod was asked to consult. He received a phone call in Montreal directly from Hugo Chavez, asking Rod to come to his palace. “Duncan,” my uncle advised me, “if any world leader asks you to meet at their palace, my best advice is to catch the next plane.”

So he reported to Chavez, who was trying to sign off the new hydro project. “What’s the problem?” my uncle asked.

“The engineers,” said the president, “they’ve recommended two sites for the hydro project – we could flood this valley over here…or,” he said, pointing to a map, “we could drown this valley over there. But which one?”

Rod knew the engineers who had written the massive report: “Mister President, the guys who wrote that report are very good. They practically wrote the book on hydro decisions.”

“That’s the problem!” barked Chavez. “I don’t want a book, I just want an answer.”

So my uncle agreed to read the report over the next 48 hours, and deliver a recommendation for the President.

“It was an enjoyable two days,” Rod reported. He was in his 70s at the time. “My hotel room looked over the swimming pool and suddenly I realised why Miss Venezuela always wins Miss Universe. Duncan, they all looked like Miss Universe.”

Two days later he reported back at the presidential palace. The report, he said, was very thorough, and had considered geo-physical risks, return on capital, social costs, time frames, delivery of electricity, worker safety, climate…the whole rich texture of risk and return on a massive capital project.

“…and?” asked the president.

“Mister President, what the report is really saying is: it’s about 50/50. So I have two questions for you, and then we can come to a decision. My first question is this: are you certain you want to build a hydro project at all?”

“Of course I am,” said the president. “We are a growing country and we do not want to be energy dependent.”

“Then it comes to this. If on economic grounds both sites are about 50/50, and on risk terms they are about 50/50, and on engineering terms they are about 50/50, and the social cost of drowning this valley here is the same as drowning that valley over there…if it’s all about 50/50, then here is my second question. Mister President, do you have any personal reasons for choosing this valley or that valley?”

Hugo Chavez looked at the map and weighed his words delicately. He pointed to one of the valleys and reflected, “You know, my mother grew up in a little village over here….”

“In that case,” said Uncle Rod, “I suggest we build the dam in the other valley.”

Rod told me the story because he wanted me to understand that decisions, big or small, need not be over-complicated. Often in statistics we are compelled to test whether one number is “statistically” higher than the next. Rod’s point: if you’re having to test whether there’s a difference, then in real terms there isn’t any difference. For that reason he and Hugo Chavez were able to take what started off as a complex quadratic equation and, layer by layer, cancel out the differences between Option 1 and Option 2. In the end, having taken a step back to check that any decision needed to be made at all, the difference between Option 1 and Option 2 was really a test of whether the president could sleep more comfortably with one choice over the other.

Both men knew they didn’t live in a perfectly black and white world, and both knew – to their credit – when to stop focusing on the decimal points and when to simply make a decision.

Rod found Chavez to be an open-minded, intelligent man. His words – “I don’t want a book, I just want an answer” – are a credo that researchers ought to live by.

Lifting your productivity by 20%. The reporting log-jam

Cleared for take-off? In MR there’s generally a log-jam around the reporting process.

The photo above shows hundreds of gannets near a beach where I live. These birds nest here, grow up here and then take flight to distant nations thousands of miles away. It is humbling to watch them survive on this rock face, and amazing to see how the parent birds always manage to identify which grey hatchling is theirs. How do they do it? How does this society, this organisation of gannets, manage to be so efficient?

I wonder similar things about market research organisations, because in these companies you see things happen that the gannets don’t bother with. The birds hold no WIP meetings, no pep talks, no team-building exercises – there’s just the constant cry of their vocal equivalent of the email network. These birds are in constant communication. The parents are task-focused (got to feed the young) and the local fishing grounds, with the exception of the occasional Great White Shark, are benign and plentiful. Fieldwork, in other words, is not a problem.

But the gannets do stumble at one point. Sooner or later comes the great migration, and these rocks will be empty for a season. Yet not every chick is ready at the same time. While some quickly master the art of gliding in the prevailing westerly breeze, others are clumsy, and apt to crash-land in an ugly flurry of gangly feathered wings and webbed feet. If this were Heathrow or JFK, it would be mayhem.

Now the moment that projects get delivered to clients is similarly full of mayhem. Some of the major inefficiencies of the typical research company occur at the reporting stage. Deadlines might be met, but too often they involve a weekend or a serious late-nighter. And if that’s the case, then something needs fixing.

So here are some suggestions to contribute to my ongoing series about how to achieve an overall 20% lift in MR company efficiency. Twenty per cent is very achievable, and really it comes down to finding a 4% gain here, a 5% gain there – as well as the courage to challenge a few things that were developed back in the 80s or 90s; for example the production-line structure.

  1. Plan the report at the questionnaire development stage. Presumably you are testing hypotheses, or measuring certain things, or unravelling mysteries – whatever you’re doing there’s going to be a story that gets developed, even if you don’t know how it is going to end. So start visualising the shape of the story and the chapters it will require and the analytics that these will involve. Let the whole team know what the plan is.
  2. Do not pass Go…until you adopt a stringent “right first time” approach to labelling errors and data errors. Test and proof the data before the report gets drafted. Are the weightings correct? Do the labels have typos?  Is the data clean? Measure twice, cut once.
  3. Challenge the multi-stage process behind the survey analysis. Having a DP department develop a whole heap of crosstabs in the hope that something will prove interesting is just plain inefficient. If a team is working on the project, then work together in parallel rather than in serial. Start by going through the report structure, then allocate who works on which part. By working in series, the project gets held up by every little random thing, and cascading delays and errors mount up. “Sorry, I can’t work on the data yet…Dave had to go to the dentist.” You know the…er, drill.
  4. Run these teams top-down rather than junior-up. I recently worked with a j-up style of organisation in which we seniors were encouraged to delegate the production of reports, and then to add “our bits” at the end of the process. I personally found this frustrating, because what the reports often needed was much more than cosmetic. The younger researchers, good talent all, hadn’t always “got” the story the data was telling. So precious hours were spent reworking key chapters of the report.
  5. Examine and use suitable production platforms. A good one to try is the Australian product “Q”, which delivers a combination of SPSS power and Excel-level exportability to PPT, and produces slides (complete with question number, n= size, date etc.) at the touch of a button. It can save the hours of needless production time otherwise lost when analysts end up messing with fonts and colours.
  6. Talk to the client during the report-writing process. Run the initial findings past them in the conversation: “Delwyn, we’ve found a high level of dissatisfaction among your core customers…is this what you guys expected? What would be the best way to report this?”  Often those conversations provide a context-check which enables you to get the nuance and tone right for the audience. Delwyn might advise you to go easy with the bad news (because the new initiative was a pet project of the boss) – whereas if you didn’t know that you’d be up, sooner or later, having to redraft the report. 
  7. Monitor productivity and set expectations. I’ve never seen a research company do this. But if a PPT deck looks like it needs 6 basic chapters, and each chapter is going to need around 10 slides, then you should be able to work out how much production time needs to be allocated. Personally, working with my own data, I can put together a slide every 8–12 minutes, depending on whether the project is descriptive or deeply detailed. That includes the analysis time. Meanwhile I’ve timed colleagues (they didn’t know this) and they averaged around 15 minutes per slide of descriptive results – about half my speed. The difference, I think, comes from certainty of the story I’m telling, a clear sense of structure, and a general tendency to give slides a reasonable amount of white space. Focus on the main details; don’t present a full table of results. After delivery, each project team should discuss the hours spent to see where improvements can be found.
  8. Start on the reporting immediately. You may think you have two weeks available to get it all together. So no pressure is applied at the start. In fact, most projects lose time in the first 72 hours – and that puts a squeeze on the rest of the schedule. That’s when errors get made or compromises (we don’t have time to dig deeper) occur. 
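The back-of-envelope in point 7 can be sketched in a few lines. The defaults below are the ballpark figures from that point (6 chapters, 10 slides each, 8–15 minutes per slide) and are illustrative, not a benchmark:

```python
def production_hours(chapters=6, slides_per_chapter=10, minutes_per_slide=12):
    """Rough production-time budget for a slide deck.

    Defaults use the ballpark figures from point 7 above; substitute
    your own measured slide rate.
    """
    total_slides = chapters * slides_per_chapter
    return total_slides * minutes_per_slide / 60

print(production_hours())                      # 60 slides x 12 min = 12.0 hours
print(production_hours(minutes_per_slide=15))  # a slower analyst: 15.0 hours
```

Put the result into the project plan at kick-off: if the budget says 12 hours of production and only six are scheduled before the deadline, the weekend crunch is already visible in week one.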

Research companies need to ask themselves how much time is spent doing actual research (analysis and thinking) and how much time is spent crafting massive decks of slides. For sure, the report represents the basic deliverable to the client, and – absolutely – it needs to be visually attractive, and tell its story even to non-analysts. But we’ve all seen too many hours wasted in dickering around with the look of the report, or in fixing errors that got woven into the report – and not enough hours spent delivering value to the client.  



Now it gets personnel: Gurus and Geeks – the architecture of the Big Data universe

The universe of analysts in the world of Big Data. Where do you live?

Over the past few months I’ve been looking at what I believe to be a major meltdown of the market research polar caps. The growth of the industry, once assured, has turned slushy, while the growth of Big Data as a field of endeavour remains double-digit. If anything it has accelerated. So how much effort will market researchers have to make if they wish to hitch their caboose to the big growth engine that’s running on the track next door?

It comes down to people and their skills and outlooks, so I started my investigation in the employment ads relating to Big Data. Ouch. The help-wanted ads are dominated globally by vacancies for “data geeks” (and that’s the phrasing they choose to use) and the qualifications revolve around technical skills (typically SQL or more advanced) as well as basic statistics. Very few ads ask for Big Data architects who can visualise and steer the mountains of digital data that every large firm is accumulating. I foresee a big trainwreck unless a few more subject-matter gurus – architects who can see the big picture – are employed in the Big Data locomotive. Wanted: a few more Denzel Washingtons.

There’s another axis to the landscape, as I see it. This borrows heavily from the thinking of John Tukey, our statistical godfather, who classified statistics into two zones: the Descriptive side (accurate reporting, concern about margins of error and significance etc.) and the Exploratory side, where new patterns are discerned, rare events are predicted, and fixation with decimal points can be quietly put aside. This is the realm of game theory, of neural networks, of unstructured data and just about all the tools that my colleagues in market research generally avoid.

But while MR practitioners seldom live in the northern hemisphere of my diagram, above, not many Big Data analysts are really working in that zone either. There will be strong demand, probably increasing demand, for those people.

If Big Data analysts have a centre of gravity somewhere in the yellow square of my diagram, market researchers dwell, predominantly, over in the green zone. They’re good subject matter experts though not great explorers.

In respect of tomorrow’s business needs, I’m picking that most Big Data teams will require a mix of skills – people from each quadrant, or a number of generalist experts – those exceptional individuals (and I’ve met a few in both MR and BD) who dwell in the centre of the data universe – by turn gurus and technical experts, one minute retrieving old numbers and making them sing – and the next minute devising predictive models to illuminate tomorrow’s business decisions.  

Trends? The world of business analysis will see shrinkage of MR as more and more data is retrieved from other sources. Meanwhile organisations will quickly get swamped with descriptive data, and the gods of tomorrow will be the Guru Explorers who can see the future and what they need – surrounded by the Explorer Geeks who can stoke the boilers and make the engine roar.


You think your spreadsheets are error free?

Researcher Ray Panko conducted a meta-study of spreadsheet quality, and his conclusions, drawn from a wide number of studies, are frankly disturbing. The link comes out of a debate about Excel’s capacity to let cascade errors ripple from one sheet to another, and about the way Excel’s functions don’t necessarily do what you think they are doing. Scary stuff and worth a read.

Social Network Analysis – a quietly useful research approach

Social Network Analysis helps us understand the systems, repertoires and peer influences that help drive our behaviours.

The first social network diagram was, I believe, drawn in the 1930s by the sociologist Jacob Moreno, who was describing the interactions between a handful of people he was studying.

Decades later, of course, we now have the computational power to describe networks not just of small groups but of larger ones – hundreds, thousands or, I suppose, millions – as we see in the network clouds that map the political blogosphere, or the social-interest clouds that define the woolly landscape of Facebook membership.

Social Network Analysis isn’t hard to conduct, and there are very good free tools available – enough to make it quite easy for any MR professional to spend an afternoon getting familiar with the possibilities. I’m surprised I don’t see the fruits of SNA everywhere. Why is it so useful?

The answer is simple. First: people are social and are influenced by peers. Robust studies (the Framingham study, for example) demonstrate quite simply that smoking is not just an individual choice but very much a peer-driven thing. Smokers, it turns out, live within networks of other smokers. Obesity follows a similar pattern. Put simply, if you are surrounded by large people, then large is your “normal.” I would imagine this social network effect applies to brand usage (the new product that everybody in the book group was raving about) and other individual choices which – if you put your network glasses on – become a lot more peer-driven than anyone might guess.

So that’s the main reason for SNA. People are social.

But there’s a second reason also. We tend to view things in terms of repertoires, clusters and systems. When I say that I prefer to avoid rush-hour, what I’m really saying is that I’m trying to avoid a whole system of issues that culminate in lengthy travel time. If you ask me what fruit I buy each week, my answer isn’t simply based on my favourite fruits in ranked order, but on my belief that I need a balanced diet – citrus, apples, stone fruit and bananas and maybe kiwis. I wouldn’t dream of a purchase without some citrus, and also bananas. I see my choices not as a collection of individual choices, but as a system that gives me a balance of flavour, goodness and value. In fact what Carlo Magni and I did was use fruit purchase data to create an SNA based not on people, but on the fruit in a typical fruitbowl – in Japan the “system view” is very different from the “system view” in my home country of New Zealand. Bar charts would not have helped us visualise this so clearly. SNA gave us a real insight into the working heuristics of the Tokyo fruit buyer. We could also understand how seldom-mentioned fruit (yummy persimmons) fitted into the larger system, and why certain types of fruit – through poor definition – have trouble “breaking into” the social network of the typical fruitbowl.
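To make the fruitbowl idea concrete, here is a minimal sketch of how such a co-occurrence network can be built. The baskets below are invented for illustration (this is not the Japanese purchase data described above); the pair counts become the weighted edges of the network, ready for any SNA drawing tool:

```python
from collections import Counter
from itertools import combinations

# Hypothetical weekly fruit baskets -- illustrative data only.
baskets = [
    {"orange", "apple", "banana"},
    {"orange", "banana", "kiwifruit"},
    {"apple", "banana", "persimmon"},
    {"orange", "apple", "banana", "kiwifruit"},
]

# Each pair of fruits bought together is an edge; the number of baskets
# in which the pair co-occurs is the edge weight.
edges = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        edges[pair] += 1

for (a, b), weight in edges.most_common():
    print(f"{a} -- {b}: bought together in {weight} baskets")
```

Run on real purchase data, the heaviest edges reveal the buyer’s “system” – which fruits anchor the bowl, and which sit out on the periphery.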

There are two more reasons for using SNA and I’ll touch on these very briefly.

Reason three is that SNAs produce a plethora of measures you didn’t expect to get. When you generate a diagram, as above, you also generate a number of statistics for each node (or individual) in the network. Two useful measures are:

1) Between-ness. The degree to which a player connects two or more quite disparate groups within the network. In an organisation there may be just a few people linking Silo 1 to Silo 2.

2) Eigenvector. The degree to which a player is plugged in to the wider network.
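Both measures are a one-liner with the widely used networkx library (assuming it is installed; the node names and edges below are invented for illustration). Two tight cliques joined through a single broker show the difference: the broker tops between-ness, while a well-connected clique member scores higher on the eigenvector measure:

```python
import networkx as nx  # third-party library: pip install networkx

# Two tight triangles ("silos") joined only through broker node B.
G = nx.Graph()
G.add_edges_from([
    ("A1", "A2"), ("A1", "A3"), ("A2", "A3"),  # silo 1
    ("C1", "C2"), ("C1", "C3"), ("C2", "C3"),  # silo 2
    ("A1", "B"), ("B", "C1"),                  # the broker link
])

betweenness = nx.betweenness_centrality(G)
eigenvector = nx.eigenvector_centrality(G, max_iter=1000)

# B carries every shortest path between the silos, so it tops between-ness...
print(max(betweenness, key=betweenness.get))  # -> B
# ...while A1, well connected inside its silo, beats A2 on eigenvector.
print(eigenvector["A1"] > eigenvector["A2"])  # -> True
```

In an organisational network, the node that plays B’s role is the one to interview first – remove it and Silo 1 stops talking to Silo 2.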

The fourth reason for using SNA is clarity. Clients easily ‘get’ a social network diagram, and can easily see how products, or people might glue together – or be disconnected.

Of course they easily get it. They’re human.