Yesterday I wrote about how non-profit entrepreneurship support organizations (ESOs) play games with the numbers they report to showcase their success. Today, I continue that conversation but focus on how foundations do the same.
Foundations fund the ESOs we talked about yesterday, so, naturally, they benefit from all of the games the ESOs play. As ESOs measure meaningless metrics, define those metrics loosely, potentially double- and triple-count clients, and rely on questionably accurate data, those numbers get reported up to the foundations. When foundations don’t question the numbers, the silliness reported at the ESO level simply rolls up into the foundations’ own reports and gets pushed out to an even wider audience.
On top of these issues, however, foundations have an additional way of fudging their numbers: duplicative counting, take two.
In addition to the potential for duplicative counting of clients within an individual ESO described yesterday, many entrepreneurs work with more than one ESO and/or CDFI receiving grant money from the same funding organization, which is also tracking its own aggregate performance metrics. Yet there is not enough communication between the ESOs’ and CDFIs’ systems to catch the duplication. So, in the example with Melody from my post Why Typical Entrepreneurship Support Success Metrics are BS, it’s possible not only that her story was logged as a success, but that it was logged as a success by the two ESOs she works with and by the CDFI.
Let’s say Melody’s contract was $500,000 and the loan was $100,000. If the two ESOs and the CDFI are all funded by the same foundation and all report Melody’s numbers in their metrics, that funding organization will show records stating that its donations helped win $1 million in contracts ($500,000 reported twice, once by each of the ESOs) and $300,000 in loans ($100,000 reported three times, once by each of the ESOs and once by the CDFI) for a total of $1.3 million versus the $600,000 that it actually was.
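To make the arithmetic concrete, here is a minimal sketch of the naive roll-up versus a deduplicated count. The organization names, the data layout, and the assumption that each org reports the same underlying deal are all hypothetical, purely for illustration:

```python
# Hypothetical reports: each funded org logs Melody's outcomes independently.
reports = [
    {"org": "ESO A", "client": "Melody", "contracts": 500_000, "loans": 100_000},
    {"org": "ESO B", "client": "Melody", "contracts": 500_000, "loans": 100_000},
    {"org": "CDFI",  "client": "Melody", "contracts": 0,       "loans": 100_000},
]

# Naive roll-up: the foundation sums every report as-is.
naive_total = sum(r["contracts"] + r["loans"] for r in reports)

# Deduplicated roll-up: count each client's outcome once, taking the largest
# reported figure per client (assuming all orgs describe the same deals).
dedup = {}
for r in reports:
    c = dedup.setdefault(r["client"], {"contracts": 0, "loans": 0})
    c["contracts"] = max(c["contracts"], r["contracts"])
    c["loans"] = max(c["loans"], r["loans"])

dedup_total = sum(v["contracts"] + v["loans"] for v in dedup.values())

print(naive_total)  # 1,300,000 — the inflated figure the foundation reports
print(dedup_total)  # 600,000 — what Melody actually won
```

The gap between the two totals is exactly the $700,000 of duplication described above, and it exists only because no one joins the reports on the client before summing.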
Thus, by the time you’re seeing a report from a foundation about the impact of the grant dollars it has deployed, there are so many issues with the data that you really can’t tell whether that foundation’s support has actually moved the needle.