Tag Archives: research

Everyone likes pretty pictures

[Image: Map of Ex-Offender Employment Options, from http://www.spatialinformationdesignlab.org]

So it’s been established that I am very spatially challenged. Still true. My oldest son spends practically the entire time we’re at our local petting zoo studying the map of the grounds, and he calls from his carseat in the back of our van, “Mommy, why are we turning around again?” whenever Mommy is trying to go somewhere new.

And he’s only three. Sigh.

But I can certainly appreciate a great map, and how this kind of visual presentation can help people connect with data in new ways and, most importantly, expose new patterns and insights that can transform what we do with that data.

That’s why I’m so excited about NonprofitMapping, a project that is collecting data about the impact of the current recession on the nonprofit landscape across the country and, interestingly, rating states based on the quality of their information about the nonprofit sector. When states can see themselves on a map as deficient in what they collect and disseminate about the existence (and, hopefully, soon, the effectiveness) of nonprofits within their borders, it should lead to a more systematic examination of the sector, which can only serve our interest in better defining our value and articulating our impact.

I don’t totally agree with the premise that the loss of nonprofits in a given area is necessarily a bad thing; I even have hope that some of the reduction in the number of nonprofits during this recession is a sign that less-effective ones are fading away and that good programs are finding ways to consolidate and grow to scale. But I am completely on board with their open-source strategy and ambitious goal of changing how we think about and look at nonprofit organizations nationwide. For social work advocates, who have such compelling stories to tell and, increasingly, good data at our fingertips, the next challenge is figuring out how to tell those stories in powerful, visual ways that resonate with today’s audience.

One of my favorite parts of NonprofitMapping, then, is not even their own work but their “blog roll” of sorts, where they list other awesome mapping projects, like the Justice Mapping Center (criminal justice and policy information) and the completely awesome Map4Change (‘discover’ injustice by looking at maps like this one, and then click through to their site to take action and find groups of others concerned about the same issues!).

The folks at MapTogether will definitely be hearing from me–they provide free map-related training and tools for nonprofit and community groups. They have a free guide to GIS and online mapping tools that even I can understand. They’re also working to make maps more accessible, including braille and audio features.

In today’s crowded information environment, social justice advocates need every advantage to help our appeals receive the attention they deserve. This is especially true in the context of the transparency movement, which will (assuming it succeeds) increase the amount of information available and, therefore, the importance of finding ways to make sense of it all. We may never be into orienteering like my husband and son, but we can rely on these outstanding partners to help us better use the rapidly-evolving tools at our disposal to, pardon the pun, put our issues on the map.

And, while you’re at it, there’s this neighborhood where my son’s friend lives that I ALWAYS get lost in…

Maybe duplication isn’t such a bad thing (?)

We hear it all the time, right? From grantmakers and politicians and our own executive directors: Thou Shalt Not Duplicate Services.

It’s one of those things that, on its face, seems like the most reasonable prohibition in the world. We have limited resources and so many problems, so why in the world would we want to duplicate what someone else is already doing?

Some of my recent thinking about organizational effectiveness and how we measure social work impact has me looking at this in a new way, though.

Because, the truth is, there are many social problems where we’re really not making much progress. Whoever is occupying that field, so to speak, could apparently use some help figuring out the best way to attack the problem. Who’s to say that they’re the right people to solve it (even if they did get there first), or that your approach might not be better?

At its core, I think a lot of the concern with duplication of services is that we’re still, too often, measuring “services”, rather than impact. And if your only deliverable is “an after-school program” rather than “increasing literacy rates among adolescent males” or something more, well, real, like that, then of course it’s going to be concerning to see several different programs all offering after-school programs, because it would arguably be more cost-efficient to centralize those resources.

But, in that scenario, the problem is with what we’re tracking (and not), not with the duplication itself. After all, if literacy rates are still lower than they should be, a funder or other interested party would be hard-pressed to argue that your organization couldn’t work on that social problem because “it’s already being taken care of.” It’s not.

But how do we make the case for innovative approaches to social problems, when there’s this preoccupation with avoiding what looks like duplication?

The key, I think, is to address the problem at its core: doing a better job of articulating the value we bring to the social endeavor, instead of talking about our outputs just because they’re what we’ve figured out how to measure.

The book The Wisdom of Crowds talks about this in a roundabout way, providing evidence that having multiple actors (most often in the scientific arena) conduct parallel experiments on the same problems leads to richer understanding of the questions at hand and far greater confidence in the results. To achieve those gains, of course, we’ve got to be rigorously evaluating our interventions, demonstrating their impact against the baseline of the social problem, and making those results available to others committed to the same outcomes, so that we can learn from and adopt the best interventions.
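(If you like seeing the statistical intuition spelled out: here’s a toy simulation, not from the book, of why pooling independent, parallel evaluations of the same intervention gives far more confidence than any single study. All the numbers are invented for illustration.)

```python
# Toy illustration (not from the book): pooling independent, parallel
# evaluations of the same intervention shrinks the error of the combined
# estimate roughly as 1/sqrt(number of studies).
import random
import statistics

random.seed(42)
TRUE_EFFECT = 0.30   # the intervention's "real" impact, in made-up units

def run_study() -> float:
    """One noisy, independent estimate of the intervention's effect."""
    return random.gauss(TRUE_EFFECT, 0.15)

single = run_study()
pooled = statistics.mean(run_study() for _ in range(25))
print(f"one study:          {single:+.3f}  (typical error ~0.15)")
print(f"mean of 25 studies: {pooled:+.3f}  (typical error ~0.15/5 = 0.03)")
```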

But those are things that we should be doing for our own understanding, anyway, so that we’re sure that we’re headed in the right direction and likely to reach our destination. And, if that’s the case, then I think we’ll be able to make the case to the “no duplication” crowd that, after all, there’s an advantage in having traveling companions.

What do you think? Would such a shift away from the “no duplication under any circumstances” policy siphon off valuable resources? Is there too little overlap between the hard sciences and what we do in social work for these parallels to be useful? What have been your experiences with best practices work and outcomes research in social services? Funders, how would you respond to an organization making a case like this?

Sorry, social workers: Statistics Matter

I stumbled upon (and I mean that literally, not in the social bookmarking sense!) this New York Times article on statistics, and it got me thinking about how important it is for social work students, recent graduates, and even us not-so-recent ones to understand, critically analyze, present, and use statistics effectively in our macro (and, I would argue, also our micro) practice.

Think about it. As the article points out, there is an abundance of data available today, and, in fact, there are often too many data points for people to really make sense of, which can (and, unfortunately, sometimes does) result in a gut reaction that spurns all logic and makes decisions instead on an ideological, even anachronistically anti-factual, basis. For social workers, what this sometimes means is that, because we don’t understand statistics as well as we should (or just don’t trust our own understanding), we either attempt to make our cases entirely through emotional appeal or fall victim to others’ flawed or deliberately misleading analyses.

Both are real dangers to our credibility and efficacy and, more importantly, to the quality of advocacy work we can do on behalf of those we serve.

We do our best advocacy when we can collect good information on the social problems we aim to solve and the strengths that our constituencies bring to those endeavors; analyze those masses of data in ways that value human realities; and help decisionmakers to use the data to make good social policy decisions.

I don’t know if statistics will ultimately be as revolutionary as the legendary ‘plastics,’ but emerging social workers would certainly be well-served by turning some of their intellectual energies to statistical analysis. That means that we social work educators have to find creative, engaging, relevant, accessible ways to communicate statistical expertise to students. Given how many of my students say (only half-jokingly, I think) that their dislike of math was part of what drove them to social work, I recognize that this won’t be easy, but important things seldom are.

Research Part II

You made it through yesterday’s marathon post about participant-led research? And you still want to know what we learned throughout this whole process? Here are some thoughts that might be helpful to other organizations whose primary task is not research but who want to layer a research agenda onto their practice and advocacy work. Below I’ve linked to the survey instrument we used (it’s in Spanish, of course), the 2006 survey report, and also a report from a youth survey that we did, also in 2006, at the request of some of the parents among our client leadership as well as our grassroots youth leaders.

  • Be accountable to your survey participants–for us, this meant continually trimming the survey so that it took as little of their time as possible, bringing the results back to them for feedback, and also taking the results into account in our own programming and advocacy decisions. We started citizenship classes because they were cited as a need, and we ranked our policy priorities as a direct outcome of the survey.
  • Be militant about confidentiality–I only allowed people to administer the surveys using standard color ink, for example; we locked the surveys in file cabinets; when it was time to shred the originals, I did it myself. People need to know that you are taking maximum precautions. We were also very clear that people had the right to refuse to participate, and that nonparticipation IN NO WAY impacted their eligibility for our services. This may have reduced our numbers, but it is the only way to ethically conduct research.
  • Use your data fully–once we had this rich resource, we used our results in our legislative testimony, grant applications, conversations with policymakers, even interviews with potential staff hires! Don’t overstate what the data tell you (be honest about their limitations), but acknowledge their full value, and be creative in their application.
  • Be timely–I worked like crazy in September and October to get the results ready, because, when you’re dealing with applied research, something is ‘old news’ after several months. The long lag between research and publication is one of the limitations of academic participation in policy debates, and we wanted to avoid it. It was funny, really, that by the time Families in Society went to press with the article, I had already left El Centro, Inc.!
  • Similarly, the context should drive some of your analysis, to make it relevant–we always added questions about how specific policies were impacting people’s lives, and also asked questions that we knew we could use for our policy campaigns (about goals to send children to college, for example, or having a driver’s license). In 2006 (see report below), the debate over immigration was raging, so we added some questions about people’s participation in the campaign and also their views on specific policy proposals being tossed around.

Materials:
• Detrás del Debate–El Centro Survey Report 2006
• Youth Survey Report, 2006
• Survey Instrument, 2006

Participant-led Research and Policy Change Part I

I still remember the day, about a month into my employment at El Centro, Inc., when my then-boss, Richard Ruiz, turned to me in his office and said, “There was a day when I knew every single person at Latino Summerfest (a carnival and community celebration El Centro used to cohost in the Argentine neighborhood). Now, our community is changing so quickly, I don’t feel like I even have a good grasp on what our clients need, what their families are like, what their issues are. How can we really find that out, in a way that we have some sense of certainty?” Being new on the job and eager to please (and not having any idea how much the following sentence would change my life over the next 6 years), I offered, “I took several research classes as part of my MSW. I could do a survey.” Needless to say, he loved the idea, and I soon found at least two full work months a year occupied by this intense process of collecting, analyzing, and disseminating information about the Latino immigrants with whom we worked.

It was 2001 when we first started, and there was quite an appetite for local information to supplement the Census data just beginning to trickle out. The Census had confirmed what Richard had observed in his own neighborhood–the Latino population increased dramatically in the Kansas City area between 1990 and 2000, and Latinos were also moving into new parts of the community, especially the suburbs. In 2001, as today, the Latino ‘community’ in the area was far from homogeneous, with very new immigrants, long-time native-born U.S. citizens or multi-generational U.S. residents, and everyone in between. Our research agenda came from a desire to better understand those we were serving, especially to prioritize our services as we often felt pulled in many directions, and also to define a niche for our organization in providing high-quality, relevant, recent information about a growing and rapidly-changing demographic. I knew, though, that I wanted to do this research a bit differently than might be expected; I wanted the participants to have a real part to play, and I wanted some commitment that we would act on the findings in a meaningful way.

Below, I sketch the process that we went through to conduct our research, as an example. This research has also been published in a few places, most completely in an article in Families in Society in 2008. Tomorrow, I’ll include links to the original survey instrument from the 2006 study as well as the report that came from it, along with lessons learned.

If your organization is considering a research project to better understand your client population or another facet of your work, where are you in this process? What resources do you need to get started? How can you ensure that your research participants will have a real voice in the project as it unfolds? What do you really want to know? How will you give people control over their own data/stories? I would love to take a look at your research instruments, help you think through data collection techniques, and brainstorm ways to creatively disseminate the information. It is definitely a significant undertaking to do research this way and to do it well, but it was also one of the most meaningful parts of my work, and one of the most lasting legacies.

El Centro, Inc.’s Survey Process:
In June every year, we would begin to draft the survey. The first year, that was a wild process–everyone from Board members to clients to staff to donors to volunteers had questions that they wanted included, and it was a lot of work to trim it down. In subsequent years, we worked with the previous survey as a starting point, cutting out questions that had yielded little helpful information, adding questions to help us make sense of ambiguous results from the previous year, and taking the document back to our administration, client leadership, and direct-line personnel. We also tested the survey with a representative group of respondents, often before or after an ESL class, to address problematic wording and change any questions that were confusing. Then, the copying began. To have a better sense of diversity among our clients, we color-coded the surveys, using a different color of paper for each El Centro, Inc. program and each external survey site.

The survey administration started in August and ran through the end of September. The first year, I think we only surveyed about 200 people, but after that, our goal was always around 900 respondents. The bulk of our respondents were surveyed at El Centro or partner agencies, but, in an effort to understand whether service recipients differed in significant ways from those not connected to agencies, we also surveyed people at some community locations. This took A LOT of time. Because respondents might have questions sparked as they filled out the survey, someone was present while they completed it. Onsite, that wasn’t too hard; we included it in the intake process, since it only took about 20 minutes to complete, roughly the time that clients would often wait, as drop-ins, to see a case manager. At churches or community gatherings or other locations, though, this meant an organizer (usually me) had to go, explain the process, request participation, and stay while people completed it. It was a great opportunity, though, to talk about our organization, get people interested in our work, and probably bring in some new clients (although we could never be sure, because the survey was anonymous). I trained our front-line staff, as well as staff at our partner organizations, in survey administration; this required going through the survey with them to address questions and explaining the confidentiality procedure (including what to do with surveys before I came to collect them).

The data coding and entry started in earnest in October. We did all of the entry in SPSS (Statistical Package for the Social Sciences). I did most of it myself, honestly, because it required someone fluent in English and Spanish (the data were all in Spanish, but entry was in English) and also proficient in data coding. We used SPSS both because I was familiar with it and because we could use it for free at the local university; purchasing it would have been prohibitively expensive.
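(For readers who want to picture what this coding step looks like: here’s a minimal, hypothetical sketch in Python/pandas rather than the SPSS we actually used. The column names, response values, and color-to-site mapping are all invented for illustration, including how the paper color let us tag each record with its survey site.)

```python
# Hypothetical sketch of the coding-and-entry step described above.
# We actually did this in SPSS; this pandas version just illustrates the
# idea. Column names, responses, and the color-to-site map are invented.
import pandas as pd

YES_NO = {"Sí": 1, "No": 0}  # data in Spanish, entry coded in English

# Paper color identified each program or external survey site.
PAPER_COLOR_TO_SITE = {
    "amarillo": "ESL program",
    "azul": "health services",
    "verde": "partner agency",
}

raw = pd.DataFrame({
    "color_de_papel": ["amarillo", "azul", "verde"],
    "tiene_licencia": ["Sí", "No", "Sí"],     # has a driver's license?
    "meta_universidad": ["Sí", "Sí", "No"],   # goal of college for the kids?
})

coded = pd.DataFrame({
    "site": raw["color_de_papel"].map(PAPER_COLOR_TO_SITE),
    "has_license": raw["tiene_licencia"].map(YES_NO),
    "college_goal": raw["meta_universidad"].map(YES_NO),
})
print(coded)
```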

When the data entry was complete, the analysis began. This meant not only conducting the actual statistical tests (both descriptive and inferential) but also researching the context of the data, including what parameters looked like in other regions and how these respondents compared to non-Latinos. This was really important to us, because we wanted to be able to help interpret the data for our survey respondents and our audiences.
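(As a concrete, hypothetical example of the kind of inferential test this stage involved: we often wanted to know whether clients surveyed onsite looked different from people surveyed in the community. In Python with scipy, rather than the SPSS we actually used, and with invented counts, that comparison might look like this.)

```python
# Hypothetical example of one inferential test from this stage: do
# respondents surveyed onsite (agency clients) answer a yes/no item
# differently from respondents surveyed at community locations?
# The counts are invented; we actually ran our tests in SPSS.
from scipy.stats import chi2_contingency

#            yes   no
table = [[120, 380],   # surveyed onsite
         [ 45, 155]]   # surveyed at community locations

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
# A small p-value would suggest service recipients do not simply
# mirror the broader community on this item.
```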

We produced three main products from this research and analysis: a full survey report (linked below for 2006), a PowerPoint presentation that summarized the main findings, and a Spanish-language presentation that included not only the most significant findings but also additional questions sparked by the results. We made the report available on our website and also shared it electronically with allies and other targets (in a few select cases, primarily donors and policymakers). It was also used quite extensively by students, by organizations preparing projects and grants for work in the Latino community, and even by policymakers directly (we had some great victories when politicians cited the report in debates or other proceedings). The PowerPoint presentation was delivered, especially in the early years, to any group with an interest in learning more about our community; the main audiences were law enforcement, educators, faith groups, civic organizations, and other social service agencies. In 2005, 2006, and 2007, we generated some earned revenue from presentations to stakeholders in January, charging $5/person or $25/agency to those interested in learning more about our research and results.

Where I think our process was particularly different is the way in which we made sure to ‘close the loop’ with presentations to survey respondents, in multiple venues, throughout the month of November. We asked for their help in interpreting some of the data, paid respondents to participate in more in-depth focus groups, and, really, began the process here of thinking about changes for the next year’s research.

Tomorrow (in a MUCH shorter post), I’ll include some of our lessons learned and the links to the documents produced, as examples. Please let me know if you have questions!