
Do we really need the frying bacon close-up?

The scene opens with a wide view of Owen leaning over the stove. Next is a close-up of Owen’s face peering down at the skillet, a bead of sweat dripping from his forehead. For two seconds we see a close-up view of sizzling bacon before returning to a wide view of Owen scooping the bacon out of the pan and carefully placing it just so on a plate of eggs and French toast. Cut to a scene of Owen bringing this newly prepared breakfast to his bride in bed.

“Happy Anniversary, honey.”

The budget-conscious movie producer drops the script on the table and stares at the director.

“Do we really need the close-up of Owen’s face? The set-up for those shots adds a ton of extra cost. And the bacon close-up? Really? Does that really add anything to the story? Are we going to sell even one less ticket if that shot is not in the movie?”

But the director insists, “Yes, we have to have those scenes. They add the emotion and visceral impact that is required to tell the story, to let the audience feel Owen’s love. They are as essential to the story as the dialogue. Those shots are the difference between a professional film and a home movie, and no one will pay to see a home movie. They may not list the close-ups as the reason they don’t like the movie, but trust me, they’re a much larger factor than you think.”

The director is right. (And don’t worry, this post will eventually get to the retail relevance.)

I’ve been reading a lot about how our brains make decisions. Books such as How We Decide, The Hidden Brain, and Switch all explore the two parts of our brains that combine to formulate our decisions. Scientifically, those parts of the brain are the neocortex and the amygdala. In Switch, the Heath brothers call them the Rider and the Elephant; others call them the rational brain and the lizard brain. Whatever we call them, our decisions are the combined effort of a conscious part of our brains that controls our rational thinking and an unconscious part of our brains (the Hidden Brain) that controls our emotions.

Think you don’t make emotional decisions? Think again.

It turns out that without our emotional brains, we wouldn’t be able to make decisions at all. In How We Decide, Jonah Lehrer recounts the story of a man whose brain injury caused his amygdala to stop functioning. As a result, he was utterly incapable of making even the simplest decisions in life. Without an emotional brain to push him toward a decision, his rational brain simply went into analysis paralysis.

Our brains are extremely powerful, but they’ve got a lot going on. As a result, they basically compartmentalize processing power and take shortcuts when encountering situations that seem similar to past situations they’ve encountered. While this compartmentalization is generally very efficient, it has its drawbacks. Here’s how Shankar Vedantam explains it in The Hidden Brain:

The conscious brain is slow and deliberate. It learns from textbooks and understands how rules have exceptions. The hidden brain is designed to be fast, to make quick approximations and instant adjustments. Right now, your hidden brain is doing many more things than your conscious brain could attend to with the same efficiency. The hidden brain sacrifices sophistication to achieve speed. Since your hidden brain values speed over accuracy, it regularly applies heuristics to situations where they do not work. It is as though you master a mental shortcut while riding a bicycle—bunch your fingers into a fist to clench the brakes—and apply the heuristic when you are driving a car. You clutch the steering wheel when you need to stop, instead of jamming your foot on the brake.

Now imagine the problem on a grander scale; the hidden brain applying all kinds of rules to complex situations where they do not apply. When you show people the faces of two political candidates and ask them to judge who looks more competent based only on appearance, people usually have no trouble picking one face over the other. Not only that, but they will tell you, if they are Democrats, that the person who looks more competent is probably a Democrat. If they are Republicans, there is just something about that competent face that looks Republican. Everyone knows it is absurd to leap to conclusions about competence based on appearance, so why do people have a feeling about one face or another? It’s because their hidden brain “knows” what competent people look like. The job of the hidden brain is to leap to conclusions. This is why people cannot tell you why one politician looks more competent than another, or why one job candidate seems more qualified than another. They just have a feeling, an intuition.

This same “leap to conclusion” occurs when people visit our websites. They come to our sites with preconceived notions about what a quality website looks like, and many times those preconceived notions have much to do with the types of design elements that many “rational” thinkers would equate to the frying bacon close-up described in the movie scenario above. It’s hard to imagine how rounded borders versus straight borders might affect someone’s likelihood to convert, but they will, because the hidden brain is making lightning-fast decisions about a site’s credibility based on everything it sees and how closely what it sees matches its past experience of credible websites. A customer will not likely point to border type as a reason she didn’t buy; she’ll just feel uneasy enough about the site that her ultimate decision to buy will go negative.

Conversely, the right design can play a huge role in increasing a site’s credibility and turning that decision to buy in the right direction. For example, numerous experiments conducted over the years show how the price of a bottle of wine can genuinely affect people’s perception of its taste. In his blog, Jonah Lehrer discusses the wine experiments and “The Essence of Pleasure” and shows how paying close attention to the “essence of a product” or a site, like “Coors being brewed from Rocky Mountain spring water, or Evian coming straight from the French Alps,” can actually lead to a change in sensory perception. This, of course, is what good branding is all about, and it can absolutely make the difference between new customers further engaging with our sites or bouncing off to another site.

Since customers won’t generally be able to tell us about specific design elements that are causing them discomfort, we need to use various techniques to help us get to the heart of the truth. Multivariate testing can be a great way to understand the immediate value of different designs. Combining multivariate testing with a predictive voice of customer methodology like the ACSI methodology used by ForeSee Results (shameless plug) can really help us understand the long-term brand impact in ways that multivariate tests alone simply cannot. It’s critically important to understand our customers’ perspectives on design in context with their overall future intentions in order to get to a truth of design’s impact that even the customer could not tell us directly.
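As a back-of-the-envelope illustration of what such a test checks (a hypothetical sketch with invented numbers, not the ACSI methodology or any particular testing tool), a two-proportion z-test can tell us whether a difference in conversion rate between two design variants is bigger than chance alone would explain:

```python
from math import erf, sqrt

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented example: 2.0% vs 2.4% conversion on 10,000 visitors each
z, p = conversion_z_test(200, 10_000, 240, 10_000)
```

In this made-up example the difference looks meaningful but the p-value lands just above the conventional 0.05 cutoff, which is exactly the kind of ambiguity the next post’s list says we need to get comfortable with.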

Metrics and methodologies can point us in the right direction, and then we need to hire and trust talented, professional designers to do their thing. In the end, high-quality, professional design speaks well to the hidden brain and leads to enhanced credibility. Enhanced credibility facilitates a better selling environment. So, yes, we really do need the frying bacon close-up.

What do you think? How is design treated in your organization? What tips do you have? Or are you not buying it?

11 Ways Humans Kill Good Analysis

In my last post, I talked about the immense value of FAME in analysis (Focused, Actionable, Manageable and Enlightening). Some of the comments on the post and many of the email conversations I had regarding the post sparked some great discussions about the difficulties in achieving FAME. Initially, the focus of those discussions centered on the roles executives, managers and other decision makers play in the final quality of the analysis, and I was originally planning to dedicate this post to ideas decision makers can use to improve the quality of the analyses they get.

But the more I thought about it, the more I realized that many of the reasons we aren’t happy with the results of the analyses come down to fundamental disconnects in human relations between all parties involved.

Groups of people with disparate backgrounds, training and experiences gather in a room to “review the numbers.” We each bring our own sets of assumptions, biases and expectations, and we generally fail to establish common sets of understanding before digging in. It’s the type of Communication Illusion I’ve written about previously. And that failure to communicate tends to kill a lot of good analyses.

Establishing common understanding around a few key areas of focus can go a long way towards facilitating better communication around analyses and consequently developing better plans of action to address the findings.

Here’s a list of 11 key ways to stop killing good analyses:

  1. Begin in the beginning. Hire analysts, not reporters.
    This isn’t a slam on reporters; it’s just recognition that the mindset and skill set needed for gathering and reporting on data is different from the mindset and skill set required for analyzing that data and turning it into valuable business insight. To be sure, there are people who can do both. But it’s a mistake to assume these skill sets can always be found in the same person. Reporters need strong left-brain orientation, and analysts need more of a balance between the “just the facts” left brain and the more creative right brain. Reporters ensure the data is complete and of high quality; analysts creatively examine loads of data to extract valuable insight. Finding someone with the right skill sets might cost more in payroll dollars, but my experience says they’re worth every penny in the value they bring to the organization.
  2. Don’t turn analysts into reporters.
    This one happens all too often. We hire brilliant analysts and then ask them to spend all of their time pulling and formatting reports so that we can do our own analysis. Everyone’s time is misused at best and wasted at worst. I think this type of thing is a result of the miscommunication as much as a cause of it. When we get an analysis we’re unhappy with, we “solve” the problem by just doing it ourselves rather than use those moments as opportunities to get on the same page with each other. Web Analytics Demystified’s Eric Peterson is always saying analytics is an art as much as it is a science, and that can mean there are multiple ways to get to findings. Talking about what’s effective and what’s not is critical to our ultimate success. Getting to great analysis is definitely an iterative process.
  3. Don’t expect perfection; get comfortable with some ambiguity
    When we decide to be “data-driven,” we seem to assume that the data is going to provide perfect answers to our most difficult problems. But perfect data is about as common as perfect people. And the chances of getting perfect data decrease as the volume of data increases. We remember from our statistics classes that larger sample sizes mean more accurate statistics, but “more accurate” and “perfect” are not the same (and more about statistics later in this list). My friend Tim Wilson recently posted an excellent article on why data doesn’t match and why we shouldn’t be concerned. I highly recommend a quick read. The reality is we don’t need perfect data to produce highly valuable insight, but an expectation of perfection will quickly derail excellent analysis. To be clear, though, this doesn’t mean we shouldn’t try as hard as we can to use great tools, excellent methodologies and proper data cleansing to ensure we are working from high quality data sets. We just shouldn’t blow off an entire analysis because there is some ambiguity in the results. Unrealistic expectations are killers.
  4. Be extremely clear about assumptions and objectives. Don’t leave things unspoken.
    Mismatched assumptions are at the heart of most miscommunications regarding just about anything, but they can be a killer in many analyses. Per item #3, we need to start with the assumption that the data won’t be perfect. But then we need to be really clear with all involved about what we’re assuming we’re going to learn and what we’re trying to do with those learnings. It’s extremely important that the analysts are well aware of the business goals and objectives, and they need to be very clear about why they’re being asked for the analysis and what’s going to be done with it. It’s also extremely important that the decision makers are aware of the capabilities of the tools and the quality of the data so they know if their expectations are realistic.
  5. Resist numbers for numbers’ sake
    Man, we love our numbers in retail. If it’s trackable, we want to know about it. And on the web, just about everything is trackable. But I’ll argue that too much data is actually worse than no data at all. We can’t manage what we don’t measure, but we also can’t manage everything that is measurable. We need to determine which metrics are truly making a difference in our businesses (which is no small task) and then focus ourselves and our teams relentlessly on understanding and driving those metrics. Our analyses should always focus around those key measures of our businesses and not simply report hundreds (or thousands) of different numbers in the hopes that somehow they’ll all tie together into some sort of magic bullet.
  6. Resist simplicity for simplicity’s sake
    Why do we seem to be on an endless quest to measure our businesses in the simplest possible manner? Don’t get me wrong. I understand the appeal of simplicity, especially when you have to communicate up the corporate ladder. While the allure of a simple metric is strong, I fear overly simplified metrics are not useful. Our businesses are complex. Our websites are complex. Our customers are complex. The combination of the three is incredibly complex. If we create a metric that’s easy to calculate but not reliable, we run the risk of endless amounts of analysis trying to manage to a metric that doesn’t actually have a cause-and-effect relationship with our financial success. Great metrics might require more complicated analyses, but accurate, actionable information is worth a bit of complexity. And quality metrics based on complex analyses can still be expressed simply.
  7. Get comfortable with probabilities and ranges
    When we’re dealing with future uncertainties like forecasts or ROI calculations, we are kidding ourselves when we settle on specific numbers. Yet we do it all the time. One of my favorite books last year was called “Why Can’t You Just Give Me the Number?” The author, Patrick Leach, wrote the book specifically for executives who consistently ask that question. I highly recommend a read. Analysts and decision makers alike need to understand the pros and cons of averages and of using them in particular situations, particularly when stacking them on top of each other. Just the first chapter of the book The Flaw of Averages does an excellent job explaining the general problems.
  8. Be multilingual
    Decision makers should brush up on basic statistics. I don’t think it’s necessary to re-learn all the formulas, but it’s definitely important to remember all the nuances of statistics. As time has passed from our initial statistics classes, we tend to forget about properly selected samples, standard deviations and such, and we just remember that you can believe the numbers. But we can’t just believe any old number. All those intricacies matter. Numbers don’t lie, but people lie with, misuse, and misread numbers on a regular basis. A basic understanding of statistics can not only help mitigate those concerns, but on a more positive note it can also help decision makers and analysts get to the truth more quickly.

    Analysts should learn the language of the business and work hard to better understand the nuances of the businesses of the decision makers. It’s important to understand the daily pressures decision makers face to ensure the analysis is truly of value. It’s also important to understand the language of each decision maker to shortcut understanding of the analysis by presenting it in terms immediately identifiable to the audience. This sounds obvious, I suppose, but I’ve heard way too many analyses that are presented in “analyst-speak” and go right over the heads of the audience.

  9. Faster is not necessarily better
    We have tons of data in real time, so the temptation is to start getting a read almost immediately on any new strategic implementation, promotion, etc. Resist the temptation! I wrote a post a while back comparing this type of real time analysis to some of the silliness that occurs on 24-hour news networks. Getting results back quickly is good, but not at the expense of accuracy. We have to strike the right balance to ensure we don’t spin our wheels in the wrong direction by reacting to very incomplete data.
  10. Don’t ignore the gut
    Some people will probably vehemently disagree with me on this one, but when an experienced person’s gut says something is wrong with the data, we shouldn’t ignore it. As we stated in #3, the data we’re working from is not perfect, so “gut checks” are not completely out of order. Our unconscious or hidden brains are more powerful and more correct than we often give them credit for. Many of our past learnings remain lurking in our brains and tend to surface as emotions and gut reactions. They’re not always right, for sure, but that doesn’t mean they should be ignored. If someone’s gut says something is wrong, we should at the very least take another honest look at the results. We might be very happy we did.
  11. Presentation matters a lot.
    Last but certainly not least, how the analysis is presented can make or break its success. Everything from how slides are laid out to how we walk through the findings matters. It’s critically important to remember that analysts are WAY closer to the data than everyone else. The audience needs to be carefully walked through the analysis, and analysts should show their work (like math proofs in school). It’s all about persuading the audience and proving a case, and every point prior to this one comes into play.
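The “flaw of averages” from item 7 is easy to demonstrate. In this hypothetical sketch (invented numbers, not an example from the book), a project needs two parallel tasks to finish, each averaging 10 days. Plugging the averages into the plan predicts 10 days; simulating the uncertainty says noticeably longer:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Two parallel tasks, each taking a uniformly random 5-15 days
# (so each averages 10 days). The project finishes only when BOTH
# are done, i.e. it takes max(task_a, task_b) days.
trials = 100_000
avg_project_length = sum(
    max(random.uniform(5, 15), random.uniform(5, 15))
    for _ in range(trials)
) / trials

# Plugging in averages predicts max(10, 10) = 10 days;
# the simulated expectation comes out closer to 11.7 days.
```

A plan built on single-point averages is systematically optimistic here, which is exactly why settling on specific numbers for forecasts kids us, and why probabilities and ranges are worth the extra effort.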

The wealth and complexity of data we have to run our businesses is often a luxury and sometimes a curse. In the end, the data doesn’t make our business decisions. People do. And we have to acknowledge and overcome some of our basic human interaction issues in order to fully leverage the value of our masses of data to make the right data-driven decisions for our businesses.

What do you think? Where do you differ? What else can we do?

Retail: Shaken Not Stirred by Kevin Ertell

