Building a Culture of Strategic Data Use
Social impact leaders face a daunting paradox.
On the one hand, they are operating in an increasingly demanding environment, as foundations increasingly require organizations to submit rigorous evidence of impact. On the other hand, organizations face declining revenue and have limited internal capacity for collecting and using data. Over the past several years, we've worked with organizations wrestling with these very real challenges. Below are three lessons we've learned about how evaluation can be used to maximize impact.
Lesson #1: Strategy matters.
There’s a lot of talk about data collection methods and analytic procedures – What is the best mode for administering surveys to get a strong response rate? How do we isolate the impact of a program? While these types of questions are important, until an organization has a clearly articulated strategy, the answers won’t have much meaning. Organizational leaders should start any data collection effort by creating a program or policy logic model and ensuring that the data they collect aligns with the indicators that matter most. Logic models depict the theory and assumptions underlying a program, policy, or strategy by linking outcomes (both short- and long-term) with activities and processes.
- Learn more about how to develop a logic model here.
Lesson #2: Evaluation should balance rigor with relevance.
Organizations often contract with external evaluators to assess program implementation and impact. This independence is important for maintaining objectivity and bringing a rigorous third-party perspective to the evaluation of an investment. However, the desire for rigor should be balanced with the need for relevance. Organizational leaders must often make decisions in the face of incomplete information, yet rigorous studies frequently conclude only that the evidence is insufficient to draw firm conclusions. To be relevant, an evaluation needs to offer actionable recommendations, even when the evidence base is only descriptive.
- See an example of one of our evaluation reports on Talent Practices in Delaware.
Lesson #3: Data can be a powerful communication tool.
Evaluation reports contain a wealth of information about the effectiveness of policy and program implementation. However, they rarely express complex ideas with clarity, and instead tend to overwhelm the reader with statistics. Research suggests that individuals make decisions based largely on emotion, not logic alone. Indeed, research by Stanford professor Chip Heath found that after a presentation, 63% of listeners remembered the stories, while only 5% remembered any individual statistic. Organizational leaders should use research and evaluation data to tell their story of change.
- See an example of how we used our Year 1 Evaluation of the School District of Philadelphia’s School Redesign Initiative to create a video communicating the impact of the initiative: http://www.schoolredesignphiladelphia.org/about.html
Come learn more about how we support organizations in developing a culture of strategic data use at our launch celebration on Tuesday, November 29th, at Industrious: https://impactedphl.ticketleap.com/launch-celebration/.