Evaluation Capacity that Sticks

In honor of Labor Day, and with some grieving for the end of my summer, I’m fully embracing the contributions of others this week.

It takes a village to come up with these blog posts, I guess?

One of my projects this year is an advocacy evaluation capacity-building initiative, in partnership with TCC Group.

I have been really excited to work alongside their consultants; having spent a fair amount of time in TCC webinars, co-presenting on advocacy evaluation with them is a real gift.

Recently, TCC distributed an article about some of their learning, from this project and others, about how to build evaluation capacity that truly transforms organizational practices, adding net capacity that transcends the period of intense consultant engagement.

It’s something we’ve been talking about a lot in the Kansas context, too: how do we ensure that we’re not just swooping in to do some evaluation with and for these organizations but, instead, helping them to build knowledge and integrate structures that will enable them to take on advocacy evaluation in a sustained and effective way?

A few points from the article, and from my engagement with this project, that I think resonate more broadly in the consulting and capacity-building fields:

  • Organizations have a lot to learn from each other: The organizations in the cohort with which I’m working clamor for more time with each other. Consultants don’t have a lock on knowledge, and not all capacity-building happens within the confines of the consultant-grantee relationship.
  • Learning needs immediate application: One of the challenges with our Kansas project is that it started in the fall which meant that, by the time that organizations had outlined their evaluation questions and begun to select instruments, it was the legislative session and they had no time to implement their ideas. Learning not applied can atrophy quickly, and we’re considering how to restructure the calendar for future cycles with this in mind.
  • We need to acknowledge the resource/capacity link: Of course it’s easy to say that the way we build capacity is to add dollars. Of course. And there’s obviously not a 1:1 relationship between, in this example, evaluation capacity and organizational budgets. But it’s also true that we can learn everything there is to know and still be crippled, in significant ways, by scarce resources, which means that true, sustainable capacity building in any area of organizational functioning has to also take into account how we build organizational resources. Period.

I believe in the process of helping nonprofit leaders ask good questions about what they’re doing, the impact that it’s having, and what they need to change.

And I want to ensure that they are positioned to keep asking those questions after I move on.

To make a real difference, it has to stick.

10 responses to “Evaluation Capacity that Sticks”

  1. This is such an interesting concept, and I’m glad to learn about this interesting initiative! It sounds like a great way to help organizations grow within themselves as well as a means to empowering organizations to begin and continue developing lasting change. I’m also curious whether this initiative helped develop community action groups or coalitions, and if so, whether those collaborations were sustainable.

  2. This particular project has ended, but it informed a lot of my thinking about how we’re approaching community-based evaluation and evaluation capacity-building with the Center. We believe the investment we can make in evaluating an organization is unlikely to have a lasting effect if we aren’t helping them identify the questions they want to ask and building their skills to ask and analyze those questions themselves. In this case, these were all individual organizations, not really coalitions, but we may be doing some coalition assessment from the Center in the coming year.

  3. “Evaluation capacity that sticks” – in my limited experience with evaluation at KU, I have watched administrators invest millions of dollars and countless hours in new, shiny tools for tracking and predicting students’ academic progress. While my role as an end-user of these new technologies, such as MySuccess, gave me little awareness of the overall acquisition process, there was no shortage of complaints about the software, or of disengagement from it. Ultimately, KU ended the contract with MySuccess (Starfish) and moved on to a new tool. I have wondered if, in hindsight, administrators learned from those mistakes (I’m sure these mistakes are more challenging to learn from when leaders do not stay long in their positions and/or do not make an investment in their successors). Perhaps we’re all learning about how an institution can spend an exorbitant amount of money with little (perceived or otherwise) thought to the future–hello, KU budget crisis. I appreciated your statement about building “evaluation capacity that truly transforms organizational practices, adding net capacity that transcends the period of intense consultant engagement.” It’s a challenge for me to imagine a university, especially one as large as KU, transforming organizational practices and adding net capacity through its work with consultants. This would likely be a radical approach for most business-model-driven capitalists who wield the bulk of power in institutional decision-making processes.

    • It’s a challenge even in smaller organizations with less bureaucracy; too often, the emphasis is on generating and delivering ‘findings’ for an external consumer (usually a funder), rather than instituting practices and cultivating cultures that seek insights through careful and sustainable inquiry. It’s hard for so many reasons–sometimes, we won’t like what we find out, people are often threatened by the thought of asking questions about how well they’re doing, the power of ‘business as usual’ is potent. But I have seen some interventions that can help, including those that seek to distance the process of asking questions from judgments of the fundamental worth of the ‘subjects’, and those that give people opportunities to experiment with instruments and, then, make discoveries themselves. Thanks for your comment!

  4. Morgan Gragert

    This post has definitely made me think! One great thing that was highlighted in your blog post and in the original article is having nonprofits working together through cohort consulting. This allows nonprofits to not only learn new knowledge and skills but continue to support one another during the implementation process.

    • I literally had a meeting today with a funder who is thinking about this whole cohort approach to capacity-building and trying to strike what may be a somewhat elusive balance, between ‘matchmaking’ and telling people they have to work/learn from each other, and this one-off technical assistance approach, which has proven not to result in very lasting capacity. Which I guess goes to show that we’re still wrestling with some of the same challenges, years later!

  5. Colin James Frickey

    This is an interesting article. It is important for organizations to be able to take what they learn from consultants and apply it, but also to learn the tools and strategies for continued growth without an outside consultant. It is important for organizations to learn from each other and work together. I really liked your point about the significance of implementing what is learned. If there is even a small gap between training and implementation, things are going to be forgotten. Implementation immediately upon training is ideal.

  6. What have you observed about this in your own learning, Colin? What influences the extent to which learning ‘sticks’? What about in practice–what makes a difference there?

  7. This is an interesting article, specifically the collaboration piece: working together and stabilizing support. I think this learning from others is a key influence in building skills. I was also struck by your statement that we can have all the knowledge there is to know about a topic and still be crippled. That is very powerful, and speaks to how impact and growth are forever changing.

    • There’s a lot of research now, too, Josh, about how organizational culture helps to determine the extent to which people can really use the knowledge they have–whether innovations in practices will be encouraged, questions allowed, and risks accommodated. Honestly, I feel sometimes like the more I learn about organizational culture, the more it permeates nearly all our experiences with institutions. Thanks for your comment!
