Prior to joining WPP, I worked at an agency that was in the midst of trying to change from a traditional mindset to one more focused on digital and emerging technologies. What was fascinating to me was not how resistant our clients were to this change, but how flummoxed the internal culture was by these deviations from the ‘norm’. In ways I could never have anticipated, I watched people react irrationally and illogically to the pending ‘threat’ of change, often based on assumptions that had no basis in reality.
Pharma has this same issue. It’s terrified of change. I mean mortally terrified. And that presents a problem because, as we all know, change is inevitable. In ways you may not realize, the inherent terror that pervades the industry may be the reason why innovations fail to take root on a consistent basis.
Think about it. All innovations have change as the underlying common denominator. Since most people working in the digital, social, or technology fields are often asked to be agents of change, we deal with the results of fear-based risk assessment all the time. Too often in that process, discussions drift toward analyzing the consequences of wildly speculative assumptions in order to determine the worst-case scenario. When conversations head down the fear path, they are rarely productive.
So as innovators, how do we overcome these sometimes-irrational responses? Are there better ways to evangelize new ideas? Will more data or facts help correct the problem? Hopefully, a better understanding of the causes will arm us all to drive change more effectively.
WE FEAR WHAT WE DO NOT UNDERSTAND
Seventeen years ago, when I started in digital marketing, websites were a scary idea. I remember conversations in design reviews that went something along the lines of… “We need to make the text an image because we don’t want someone just copying it and pasting it somewhere else,” “How do I know the images won’t be on another site? Can we lock them?” or “I’d like some kind of password to protect it because I don’t want just anyone being able to see it.”
Discussions like this about a website may seem very silly now, but back then the fear was palpable. Websites were new. The unintended consequences of using them were unclear. They seemed risky.
My grandfather used to tell me a story about when the first employee-accessible telephone was installed at his offices at IBM. The floor manager, in an attempt to minimize the risk of such a device, placed it on a waist-high pedestal in the middle of the office floor where just about everyone could see it. His rationale was that there were just too many unknowns to account for, and as such, he wanted to be sure nothing would go awry. “What if employees start using it to make personal calls?” “What if a competitor calls and pretends to be a customer to steal our ideas?” What he was actually trying to articulate was “How will I manage the unknown risks from this change?” I’m sure all of you, given a few moments to ponder, can think of instances in your own lives where you’ve inadvertently gone through this kind of thought process. It’s part of being human, and in part we’ve been conditioned to think this way, often without noticing.
WE LIVE IN A CULTURE OF FEAR
The term A Culture of Fear is often used by scholars, writers, journalists, and politicians to describe the belief that some in our society incite fear in the general public as a means of achieving particular goals. This is typically done by giving disproportionate attention to a particular subject or point of view. It can manifest as sensationalism, where a headline is written to make a story appear more interesting or salacious, or can go as far as outright manipulation of an audience.
To understand this concept in action, let’s look at a recent example: cyber-bullying.
A Google search for “cyber bullying” yields 16,900,000 results. Do a similar search on the term “bullying” and Google returns roughly 19,000,000 results. The implication: cyber bullying is generating nearly as much content online as bullying in general, despite being only one narrow form of it. The topic of cyber bullying has been given so much attention lately that articles detailing the subject have been written by the New York Times, PBS, ABC, and others. Given this, you’d think that it’s an out-of-control epidemic that should at the least make people nervous, and at most garner some kind of Federal regulation to curb the problem.
But here’s the thing. According to the Department of Education, 28% of students reported being bullied in schools vs. 6% who reported being bullied online. The difference? School bullying happens in adult-controlled, well-understood environments. Cyber bullying happens online. Parents are more afraid of the online space because it is an environment they don’t understand. Unfortunately, journalists and media outlets that push an over-abundance of these stories just to get attention are manipulating that fear.
Now think about social media being introduced to the risk-averse culture of pharma. The missteps, especially in this industry, have always garnered a disproportionate amount of attention, and are typically described in highly inflammatory language. This conditions the uninitiated to believe that these programs must be risky and, by extension, that dangers lurk around every corner. After all, failure is much easier to describe, and more interesting to read about, than success.
THE ATTENTION ECONOMY MAKES THIS PROBLEM WORSE
Herbert Simon defined The Attention Economy as the phenomenon whereby a wealth of information flowing at us must, by extension, create a dearth of something else. In this case, the something else is our attention. More information coming at us means we have less time to think and decide properly about the nature of what we see and read, especially when it’s new. You can see this in action in any comments thread online. The internet, it seems, runs on knee-jerk reactions.
In his book, Politics of Fear: Beyond Left and Right, Frank Furedi posits that the information-laden era we live in is one of the drivers making our behavior more receptive to fear-based manipulation. The domino effect of those changes follows a predictable pattern within cultures and groups.
The culture shifts its reaction to harm and mishaps
People no longer accept natural disasters or acts of God as blameless events. Someone or something must be responsible, and therefore mishaps are redefined as preventable injuries.
Stuff no longer happens.
Which leads to a victim mentality
People are no longer expected to rise above adversity. As such, misfortunes haunt people and organizations and the ‘victim’ is viewed as someone who should have known better or seen this coming.
Stuff no longer happens and you could have stopped it.
Which leads to worst case thinking
The fear that any action, like inventing a new medicine, launching a new program or opening previously closed gateways, may have catastrophic consequences. “Yes, but what if…” thinking decreases the cultural capacity to deal with uncertainty.
Stuff no longer happens and you could have stopped it if you had just…
At its natural conclusion, risk becomes something to avoid, not an opportunity to be seized
Stuff no longer happens and you could have stopped it if you had just not done anything risky in the first place.
Therefore SAFETY, not success, becomes the ultimate goal.
The natural consequence of this progression is the cultural adoption of a concept known as The Precautionary Principle: the demand that innovators prove their inventions will never cause harm before they are allowed to deploy or sell them. Roughly translated: if an action might cause harm, inaction is preferable.
Sounds like pharma right? Unfortunately…
OUR BRAINS ARE NO HELP EITHER
Cognitively speaking, we’re terrible at assessing risks. Our brains, overloaded with information, have developed some rather bad habits, which in turn lead to some rather poor decision making on our part. These shortcuts, or cognitive biases, distort our perceptions of particular subjects, often leading us to erroneous conclusions. As innovators, there are three biases that most commonly work very, very hard to short-circuit our ability to think rationally about opportunities and new situations. Understanding them may unlock the keys to pushing groups or cultures forward into unknown territory.
The Availability Bias: This term describes how the brain evaluates risk based on the availability of similar situations it can readily recall when making decisions.
In other words, we assess risk based on the examples most easily brought to mind. For instance, after 9-11 most people were afraid to fly even though, statistically speaking, air travel had never been safer due to heightened security. Why? You couldn’t turn on the TV without seeing images of those planes tragically crashing into the World Trade Center or the Pentagon. Statistically speaking, however, it was (and still is) much more dangerous to get in your car and drive to Target. Our brains see images of plane crashes far more often than traffic accidents; therefore our brains evaluate air travel as far more risky.
If all one knows about a given situation is the examples of failure, rather than the successes, our brains will see nothing but red flags. This is important to remember when piloting new programs or ideas.
Anchoring: This term describes the common human tendency to rely too heavily, or “anchor,” on one trait or piece of information when making decisions.
A question for you: Deb is buying a camera online and needs a case to go with it. In which circumstance is she likely to buy a more expensive case?
A. When a website recommends a case to her as she’s purchasing the camera
B. When she realizes later that she needs a case
C. None of the above
The answer is A. Why? When presented alongside the relatively high cost of the camera, the cost of the case seems small by comparison, so Deb is willing to spend more on it than she would later, when the case is judged on its own.
With innovations, when all that can be cognitively referenced about a given idea are the dangers and consequences, the benefits of success will appear relatively insignificant by comparison.
Cognitive Backfire: This term describes the phenomenon whereby, when people are presented with facts that contradict deeply held beliefs, those beliefs become more entrenched, not less.
This one is my personal favorite. Loosely translated, facts don’t change beliefs. Read the comments sections for articles pertaining to President Obama’s birth certificate and you’ll see this in full effect.
Ever show someone tons and tons of data, only to watch them find random, trivial reasons to hold to the status quo or say no to your idea? That’s cognitive backfire in action.
Given all of these roadblocks, it seems hopeless, right? Not quite. As GI Joe taught us, knowing is half the battle. If you understand these principles and can spot the phenomena in action, you can actually make them work in your favor.
MY 5 STEP PROGRAM TO GUARANTEE SUCCESS*
- Align with core organizational beliefs
Every organization, no matter what size, has core beliefs or a credo of some variety that helps drive its cultural identity. The more you can align your innovations to those beliefs and tie them to organizational goals, the more likely you are to succeed. It may sound counter-intuitive, but often framing innovations through the lens of customer preference has the exact opposite effect you’re after. Instead, approach innovations like this… “If we say we’re a company that believes in X, shouldn’t we be figuring out how to do Y?”
- Socialize success
This gets back to the idea of anchoring. Based on everything you’ve read thus far, this should be a “duh” moment, but the more success you can seed within a group the more positive frames of reference there are to evaluate a potential idea.
- Change the language
Words mean things. To you, a word like ‘comments’ or ‘sharing’ may seem perfectly innocuous, but others may see red flags at the very mention. My favorite example of this comes from a program I was trying to get approved a few years back. The program was a blogger outreach campaign to raise awareness for the launch of a product. I attempted three different approaches to gain approval, to no avail. One day I got the idea to change the name from Blogger Outreach to Online Journalist Briefing, and the program was approved with no changes. To that group, the word blogger carried with it large amounts of risk. They understood what journalists did, and therefore the risks felt much less significant.
- Broaden the scope
With any major decision, the more minds you can put in a room, the better. By broadening the decision-making process you achieve several benefits. First, including multiple people in the final call minimizes the perceived personal risk for anyone saying yes. If anything does go wrong, the consequences are distributed, not concentrated. Second, it allows for a broader range of viewpoints on the topic, which opens the possibility that peers who support an idea will sway potential naysayers. Third, even if you fail to get a new idea approved, you’ve maximized its exposure to as many key leaders as possible, increasing the potential for other new ideas to succeed in the future.
- Pay a compliment
This ties into a concept I didn’t spend much time discussing here, called priming. If you visit Costco, the big-screen TVs and jewelry aren’t at the front of the store by accident. They prime your brain to put you in a good mood. You may not buy the 100” flat-screen TV, but fantasizing about owning it makes you happy, and when you’re happy, you buy more stuff. Believe it or not, if you open a review meeting for your new idea by saying something nice about the people you need approval from, your odds of success go up noticeably.
While I’ve only scratched the surface of these concepts, the behaviors and motivations they uncover can often be key to driving success for your innovations. Give them some consideration and report back on how they’ve helped. I’d love to hear about your experiences. Until then, I recommend a great book on the subject, Jonah Lehrer’s How We Decide. It does a much better job of unpacking many of the subjects I’ve illuminated here.
*No actual guarantees provided