Friday’s guest post on Sean Stannard-Stockton’s Tactical Philanthropy blog by Bob Ottenhoff, CEO of GuideStar, introduced a new tool to help assess nonprofit effectiveness. The guest post and Sean’s post today generated a few interesting comments; Charting Impact certainly heralds a positive direction in improving how we assess and communicate impact. That said, I still wonder how far we are from being able to truly align our charitable giving and investments with the most effective nonprofits.
A few thoughts / comments:
1. Will the information presented by a nonprofit be current and real-time? How often will the information be updated? My ongoing concern about assessing metrics is that the data points lag (commonly by 12–18 months) the current environment of an organization. Back in September I wrote:
“Similarly, performance outcomes are also a look back and don’t provide a full real-time overview of the organization. For example, let’s take an organization that has had a tremendous success in delivering services and has generated a track record of success. One could presume that an investment in that organization will generate additional high community impact. But that organization’s track history and performance outcomes won’t reveal that it is a dysfunctional internal mess that managed to achieve strong performance metrics despite its limitations. Nor will it tell you if those achievements were attained because of a visionary and talented person at the organization who has since left that nonprofit for a new enterprise.”
I believe those concerns still stand and need to be addressed. The five questions in Charting Impact may still not uncover the true real-time state of the organization.
2. Charting Impact conducted a pilot test of the framework with 39 organizations last summer. How did the results of the pilot test correlate with nonprofit ratings already published by other groups? Were there any outliers, or did the results validate data already publicly available?
3. How will the answers to the five questions be presented to potential donors or investors? If donors or investors do not easily understand the data, many will not make the extra effort to research and obtain clarifications.
4. I have to disagree with Sean that “the quality of the answers will be a stronger indicator than the information cited in the answer.” As with reviewing proposals requesting funding, we can’t be swayed by beautifully written prose or put off by poorly written responses. The information cited in the answer, whether well or poorly written, should carry tremendous weight in our analysis.