EVALUATION

Here’s where we start:  What do you want out of the evaluation?  Not the funder. Not your board. Not your neighbor. You.

It’s not that the needs of these key stakeholders aren’t important.  They are.

It’s just that if you are reaching out to us, you most likely have a day-to-day relationship with the program, initiative, or organizational structure, and you have key questions about how to maximize impact.

We want to make sure that at the end of the day, you get what you need out of the evaluation.


Our Approach

The Process. Our evaluations are led by a primary researcher with evaluation expertise. This lead evaluator is supported by our team of senior consultants with complementary expertise in areas such as nonprofit infrastructure, law, finance, communications, technology, and strategic planning.

Why? We go deep on evaluation expertise AND subject matter expertise to develop the best questions, analyze qualitative and quantitative data, and help your organization figure out your impact.

Although our team is well-versed in the various evaluation methodologies, we avoid committing to any single methodology from the start. We carefully examine the problem or program to be evaluated and design an evaluation plan that will lead us to the information you need.

You may need formative or summative evaluations, or a combination, and we can help you understand the difference between them and the appropriate use of each. We usually integrate qualitative and quantitative methods and, for the type of work our clients do, we lean toward the qualitative, using interviews and focus groups. For clients seeking organizational change, we often apply the management-oriented systems model, because it places evaluation within the larger framework of organizational activities.

The Product. If the purpose of your evaluation is to inspire organizational or methodological change, and the findings are going to be shared with your board, funders, and key stakeholders, the evaluation report should be accessible and inviting; it should be something that people want to read. 

Not only will you get a quality analysis, but you’ll get a great package: an evaluation report with a clean, simple design and visuals; a slide deck; and a brief infographic that shares lessons learned in a user-friendly format.

Deeper Dive. We’d love to help you take the lessons and findings from the evaluation and figure out how to improve your work. While a black box evaluation determines only whether an intervention has an impact on outcomes, the theory-driven approach we prefer assesses outcomes and helps determine how and why the intervention works. It integrates questions of quality and method and, we believe most importantly, provides the information needed to improve programs. This approach also lends itself to providing technical assistance in core areas, or to aligning the evaluation with a strategic planning process that uses the findings to inform next steps and set new directions.

The Costs Involved. Clients should typically estimate 10% of a project, initiative, or organizational budget for a thorough evaluation, although this is a general guideline and starting point rather than a hard-and-fast rule. Costs, of course, vary. The more sophisticated the design of the evaluation, the greater the cost. Randomized controlled trials and quasi-experimental designs that use control or comparison groups can be quite expensive, but such studies are rarely needed for organizational change. We are prepared to develop a design that meets both your budget and your evaluation needs. If we provide additional technical assistance or capacity-building support, overall costs will generally be higher.


Are you interested in learning more about our evaluation portfolio? Contact us at (708) 570-1606, ext. 101, or email lmcgill@kempwhitfield.com.