VolunteerMatch’s Zeph Harben was part of a panel on data-driven decision making at the 2011 Nonprofit Technology Conference in Washington, DC. He wrote the following article for the NTEN Blog to summarize some of his thoughts on the topic.
By Zeph Harben
Every day we encounter hundreds of decision points. Thankfully, most of them are quite small.
- Should I have a bagel or Wheaties for breakfast?
- Should I wear jeans or khakis to work?
- Should I watch “American Idol” or try to get through a few pages of Finnegans Wake?
Ideally, most of these will be snap judgments. We follow our habits or rely on intuition. We allow the choice to bubble up without losing much time agonizing over the outcome. The human mind, it turns out, is an extremely efficient manager of the chaos of life.
However, patterns that work so well on small decisions often fail us when it comes to the big stuff. While it may be obvious that programmatic strategy – decisions that affect your coworkers, your volunteers, or your organization’s mission – shouldn’t be based on “snap judgments”, it’s incredibly hard to make the human mind work any other way. All too often, the decisions that arise from going with your gut turn out to be mistakes that set you back weeks or months.
The key to making sound, big decisions is to ground them in real-world, vetted, unprejudiced, unadulterated facts. Working in I.T. for the last 8 years, most recently as Director of Application Architecture at VolunteerMatch, I’ve had some experience working through mistakes. While gut-based decision-making is as natural as choosing your breakfast, I’d like to offer some tips on how to recognize the patterns, and ultimately change the way your organization makes strategic decisions.
Cognitive Bias: The Invisible Elephant in the Room
There are big reasons why organizations struggle to stop driving strategy with instinct. To create a culture that encourages data-based decisions, leaders must first recognize the cognitive biases that influence those decisions.
What’s a cognitive bias?
Wikipedia has a great primer on the topic for amateur organizational psychologists. As you read it, you’ll begin to spot the biases that your team suffers from.
In the case of VolunteerMatch, we’re not immune by a long shot. As an application service provider with roots that stretch back to the early days of the Web, we’ve been through several phases of change and development. Sometimes we’ve led trends. Sometimes we’ve lagged behind them.
Here are a few examples:
The Bandwagon Effect
In 2008, we launched Facebook Connect, a “single sign-on” method that allows our visitors to use their Facebook credentials to log in. Like many Web services, we were following the social media trend without realizing how much our perspective was skewed by the Bandwagon Effect.
We could see that the internet was deluged with “me-too” Facebook implementations, and we lacked clear data on the potential impact of the change. Our discussions at the time centered on the need to “do something”. We charged ahead, investing months in the integration.
The project had some positive outcomes – we fixed an archaic registration and login process, and Facebook Connect kicked off a larger social media effort that is still paying dividends for VolunteerMatch today. However, in terms of our key metrics we saw little impact from Facebook Connect. While there was a ton of buzz about Facebook at the time, our particular implementation fell flat with visitors.
Three years and much market research later, we now know that providing better integration for sharing tools (e.g., AddThis, Facebook Activity Widget, etc.) has been the key to enhancing social activity and has increased our presence on Facebook and elsewhere. Thankfully, these other data-led investments have finally paid off for VolunteerMatch.
Here are a few tips to avoid your own Bandwagon Effect:
- Perform a pilot study. Take a ride on the bandwagon, but be ready to jump off.
- Collect clear data on the impact of several distinct approaches to the problem you are trying to resolve.
- Use A/B testing methods to test a single variant in your workflow.
- Test small, incremental changes instead of overhauling everything. Be prepared to return to old methods if it’s not working.
- Avoid using marketing hype as a substitute for your own analysis.
- Try, and be willing to fail, to prepare your team for future success!
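For teams that want to act on the A/B testing tip above, the analysis step can be surprisingly simple. Here is a minimal sketch of a two-proportion z-test in plain Python (standard library only) for comparing conversion rates between a control and a variant. The sign-up numbers are hypothetical, invented for illustration — they are not VolunteerMatch’s actual data.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert differently than A?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical pilot: 120/2400 registrations on the old login flow (A)
# vs. 150/2400 on the new single sign-on flow (B)
z, p = ab_test_z(120, 2400, 150, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A p-value above your chosen threshold (0.05 is conventional) means the pilot hasn’t demonstrated a real difference — which is exactly the signal to jump off the bandwagon rather than invest months in a full rollout.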
Status Quo Bias
Status Quo bias takes many forms, but in simple terms it involves fear of change. In organizations, the Status Quo bias can emerge from a strong cultural identity, or static models of your constituents and their needs. If you’ve ever heard the phrase “This is the way we’ve always done it,” you’ve encountered the Status Quo bias.
At VolunteerMatch, the Status Quo bias recently infected our approach to attracting visitors. After many years of stellar growth in visitors, we observed that website traffic was flattening. Our decisions related to this trend reflected a strong bias towards Status Quo.
For years, VolunteerMatch grew by providing users with Google-like tools to search, segment, slice and dice our volunteer opportunities. But internet behavior is changing – people find opportunities today through content curation and sharing, and they expect information to be presented in a contextual, localized format like on Yelp or Google Places.
At first, we focused on small-scale improvements to our registration process and search tools. We reasoned that if we simply adjusted the old formula, we would return to our previous rates of growth and success metrics.
However, visits continued to stagnate.
Finally, with extensive research and data analysis, plus some help from an SEO consultant, we were able to see how outmoded our assumptions about the way people search for volunteer opportunities had become.
Today we’re developing functionality around localized content, and ensuring that our volunteer opportunities are more discoverable by search engines that prioritize local content.
The results so far? Extremely positive – our success metrics and website traffic have been improving steadily.
Ride the Wave, Not the Bandwagon
Over time, we’ve been able to align our strategy more closely with new trends and changes in the competitive landscape. This hasn’t been easy, but by countering cognitive biases with data analysis and an incremental approach to change, we now have a sense of “riding the wave” — instead of riding the bandwagon.
Zeph Harben is the Director of Application Architecture at VolunteerMatch. Previously he was Director of Information Technology for TRUSTe.