A question that frequently puzzles mission-driven leaders is: how do you measure mission impact? The answer is difficult to sort out. Indeed, challenging as it might be, measuring activity, efficiency and capacity is a cakewalk next to measuring mission impact. Here are some thoughts as you set out to answer this important question.
Envisage the answer. Start with the organization's vision to remind yourself of what complete success looks like. When I was a litigator examining a witness, the notecards in front of me contained the answers I wanted to elicit, not the questions I intended to ask. Having the desired result in front of me, rather than the questions, freed me from my preconceived notions of how the witness would respond and gave me the flexibility to tailor the questions to the unpredictable tone of the moment. The same can work here. Instead of basing the questions on the programs and projects already in place, begin by thinking about the fundamental problem the organization seeks to solve; form initial impact questions unconstrained by the existing programs.
1-2-3... What can be counted to determine whether there have been gains in solving the underlying problem? Then drill down, drill down, drill down more. Do not let yourself or your organization off the hook on the hard metrics. Struggle to determine how to unearth unmanipulated raw data, apply rigor, and establish causation between interventions and results. Make sure your organization is not rebranding input or output numbers as outcome or impact measures. Ultimately the measures have to be organization-specific, but sift for excellent ideas from other organizations with similar, relevant characteristics.
Remember the endgame. Step back from the numbers and the frustration they might bring. No metrics are perfect. Every calculable result has multiple causes, and not all of them can be controlled for. Many generally accepted measures are based on proxy values. Many established conclusions are based on inference. "Not everything that counts can be counted, and not everything that can be counted counts." Qualitative data is valid data. Conduct interviews, self-assessments and observations in a systematic way, and then analyze that information alongside the inevitably flawed, but oh-so-important, quantitative data. Ultimately, synthesizing the collected information in an intellectually honest way and being able to tell the authentic impact story matters more than the label on the data collected.