The Ivory Tower and Social Entrepreneurs
You’re out there doing good. Or so you believe. But how do you know? And how do you show it (especially to your funders)?
The issue of impact measurement is a critical and inescapable one in our field. On Social Edge, we have an entire blog devoted to it. I’m not an expert in the intricacies involved (I hated Statistics 101 and I barely know what a regression is today), but I do know that when I work with my clients on strategic planning, marketing, and fundraising, the thing I always wish we had more of is good, solid data demonstrating that the program is actually accomplishing what we believe it is.
The issue has long been a problematic one in the relationship between nonprofits and foundations. Funders understandably want to know that their investment is paying off and want their grantees to offer proof. But nonprofits – especially the smaller ones – often can’t afford the kind of impact measurement that carries statistical validity, leading them to grumble under their breath, “If you want it, why don’t you fund it?”
Funding is required because impact is so difficult to prove with statistical validity. Yes, you can say that 30% of your youth program’s participants went on to college. But is that really more than would have been the case without your program? And if it’s more than the average in the wider population, how do we know you’re not just cherry-picking the best youth for your program? How do we know it’s your program that’s actually having the impact, rather than some other factor? Getting the control group right, running double-blind experiments, and the other stuff Statistics 101 requires is complicated, time-consuming, and expensive.
Which is why the vast majority of nonprofits never do it.
That’s a big problem for them – and not just because of funders. When social interventions go forward without good data, there is huge potential for effort that is sub-optimal, wasteful, or even downright counter-productive. Anyone remotely familiar with reform in any field – education, international development, youth, etc. – knows how many expensive programs have been tried over the years, only to turn out to be utter failures. It is very, very easy to believe you are out there doing good, and be very, very wrong.
We need the data but we can’t afford to obtain it ourselves. So what’s the answer?
I believe this is where social entrepreneurs and foundations need to be much more proactive about establishing partnerships with the one sector that is ideally suited for data collection: academia. Social scientists are trained and motivated to measure and prove, and they are looking for interesting problems to address.
One example is MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL), founded and led by MacArthur Fellow (the “genius grant”) Esther Duflo. The lab specializes in measuring the impact of programs in the field, and its work has been highly influential.
If you go to the Lab’s excellent site, you’ll see some conclusions that may even run counter to some of the assumptions behind the efforts of social entrepreneurs who write for our site! I won’t get into the specifics myself (the issues are obviously complex), but my point is that we have to take this issue of measurement very seriously. None of us wants to be doing work that is ineffective, and when we fly blind – even partially blind – we run a high risk.
Obviously, we can’t all get a MacArthur Fellow to study our own particular program. But we can work at getting someone to do so. A social scientist just starting out, a grad student, even an undergraduate statistics class would all be good starts. Here in the Bay Area, there are many academic institutions that would be receptive to some sort of project, but most nonprofit leaders just haven’t made a proactive effort to explore a partnership. If nothing else, nonprofit leaders should be familiar with the studies out there and align their efforts accordingly.
There is growing openness in academia to highly practical research on even small programs. Duflo’s lab usually studies very small interventions, and she was just awarded the John Bates Clark Medal, the most prestigious award given to young economists (and often a predictor of the Nobel Prize). We in the nonprofit sector need to take advantage of this openness. Leaders in our sector – especially those in foundations – should be establishing stronger links with the academic sector. This already happens at the highest levels, where the Gates, Rockefeller, and Ford foundations of the world sponsor sophisticated and expensive studies of large programs. But we need to develop solutions at the mid and small levels as well.
We have to try. The truth is out there.