Participant-led Research and Policy Change Part I

I still remember the day, about a month into my employment at El Centro, Inc., when my then-boss, Richard Ruiz, turned to me in his office and said, “There was a day when I knew every single person at Latino Summerfest (a carnival and community celebration El Centro used to cohost in the Argentine neighborhood). Now, our community is changing so quickly, I don’t feel like I even have a good grasp on what our clients need, what their families are like, what their issues are. How can we really find that out, in a way that we have some sense of certainty?” Being new on the job and eager to please (and not having any idea how much the following sentence would change my life over the next 6 years), I offered, “I took several research classes as part of my MSW. I could do a survey.” Needless to say, he loved the idea, and I soon found at least two full work months a year occupied by this intense process of collecting, analyzing, and disseminating information about the Latino immigrants with whom we worked.

It was 2001 when we first started, and there was quite an appetite for local information to supplement the Census data just beginning to trickle out. The Census had confirmed what Richard had observed in his own neighborhood: the Latino population had increased dramatically in the Kansas City area between 1990 and 2000, and Latinos were also moving into new parts of the community, especially the suburbs. Then, as today, the Latino ‘community’ in the area was far from homogeneous, encompassing very new immigrants, long-time native-born U.S. citizens, multi-generational U.S. residents, and everyone in between. Our research agenda came from a desire to better understand those we were serving, especially to prioritize our services when we often felt pulled in many directions, and also to define a niche for our organization in providing high-quality, relevant, recent information about a growing and rapidly changing demographic. I knew, though, that I wanted to do this research a bit differently than might be expected: I wanted the participants to have a real part to play, and I wanted some commitment that we would act on the findings in a meaningful way.

Below, I sketch the process that we went through to conduct our research, as an example. This research has also been published in a few places, most completely in an article in Families in Society in 2008. Tomorrow, I’ll include links to the original survey instrument from the 2006 study as well as the report that came from it, along with lessons learned.

If your organization is considering a research project to better understand your client population or another facet of your work, where are you in this process? What resources do you need to get started? How can you ensure that your research participants will have a real voice in the project as it unfolds? What do you really want to know? How will you give people control over their own data/stories? I would love to take a look at your research instruments, help you think through data collection techniques, and brainstorm ways to creatively disseminate the information. It is definitely a significant undertaking to do research this way and to do it well, but it was also one of the most meaningful parts of my work, and one of the most lasting legacies.

El Centro, Inc.’s Survey Process:
In June every year, we would begin to draft the survey. The first year, that was a wild process: everyone from Board members to clients to staff to donors to volunteers had questions that they wanted included, and it was a lot of work to trim it down. In subsequent years, we worked with the previous survey as a starting point, cutting out questions that had yielded little helpful information, adding questions to help us make sense of the previous year’s ambiguous results, and taking the document back to our administration, client leadership, and direct-line personnel. We also tested the survey with a representative group of respondents, often before or after an ESL class, to address problematic wording and change any questions that were confusing. Then, the copying began. To have a better sense of diversity among our clients, we color-coded the surveys, using a different color of paper for each El Centro, Inc. program and each external survey site.

The survey administration started in August and ran through the end of September. The first year, I think that we only surveyed about 200 people, but after that, our goal was always around 900 respondents. The bulk of our respondents were surveyed at El Centro or partner agencies, but, in an effort to understand whether service recipients differed in significant ways from those not connected to agencies, we also surveyed people at some community locations. This took A LOT of time. Because the survey might spark questions as respondents filled it out, we made sure someone was present while they completed it. Onsite, that wasn’t too hard; we included it in the intake process, as it only took about 20 minutes to complete, the time that clients would often wait, as drop-ins, to see a case manager. At churches, community gatherings, or other locations, though, this meant an organizer (usually me) had to go, explain the process, request participation, and stay while people completed it. It was a great opportunity, though, to talk about our organization, get people interested in our work, and probably bring in some new clients (although we could never be sure, because the survey was anonymous). I trained our front-line staff, as well as staff at our partner organizations, in survey administration; this required going through the survey with them to address questions and explaining the confidentiality procedure (including what to do with surveys before I came to collect them).

The data coding and entry started in earnest in October. We did all of the entry in SPSS (Statistical Package for the Social Sciences). I did most of it myself, honestly, because it required someone fluent in English and Spanish (data all in Spanish, but entry in English) and also proficient in data coding. We used SPSS both because I was familiar with it and because we could use it for free at the local university; purchasing it would have been prohibitively expensive.
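For readers planning a similar project without SPSS, the core of that coding step (translating Spanish response text into the English numeric codes used for analysis) can be sketched in a few lines with pandas. The column names and categories below are purely illustrative, not from the actual El Centro instrument:

```python
import pandas as pd

# Hypothetical codebook: Spanish response text -> numeric analysis codes.
# (Variables and categories are invented for illustration.)
CODEBOOK = {
    "pais_de_origen": {"México": 1, "El Salvador": 2, "Guatemala": 3, "Otro": 9},
    "tiempo_en_eeuu": {"Menos de 1 año": 1, "1-5 años": 2, "Más de 5 años": 3},
}

def code_responses(raw: pd.DataFrame) -> pd.DataFrame:
    """Map raw Spanish responses to numeric codes; unmapped values become NaN."""
    coded = raw.copy()
    for column, mapping in CODEBOOK.items():
        coded[column] = raw[column].map(mapping)
    return coded

# Three example respondents
raw = pd.DataFrame({
    "pais_de_origen": ["México", "Guatemala", "Otro"],
    "tiempo_en_eeuu": ["1-5 años", "Menos de 1 año", "Más de 5 años"],
})
coded = code_responses(raw)
```

A shared codebook like this also documents the coding decisions, which matters when more than one bilingual person is doing entry.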

When the data entry was complete, the analysis began. This meant not only conducting the actual statistical tests (both descriptive and inferential) but also researching the context of the data, including what parameters looked like in other regions and how these respondents compared to non-Latinos. This was really important to us, because we wanted to be able to help interpret the data for our survey respondents and our audiences.
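As a rough illustration of what those descriptive and inferential steps can look like outside SPSS, here is a minimal sketch in Python, using made-up data. The variables (survey site versus a yes/no outcome) are assumptions for the example, not findings from the actual surveys:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Invented coded data: where the respondent was surveyed (echoing the
# color-coded paper) crossed with a hypothetical yes/no outcome.
df = pd.DataFrame({
    "site": ["onsite"] * 6 + ["community"] * 6,
    "outcome": [1, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1],
})

# Descriptive: proportion reporting the outcome, by survey site
summary = df.groupby("site")["outcome"].mean()

# Inferential: chi-square test of independence between site and outcome
table = pd.crosstab(df["site"], df["outcome"])
chi2, p, dof, expected = chi2_contingency(table)
```

With a real sample of ~900 respondents, the same two steps (group summaries, then tests of independence between subgroups) cover much of what a community report needs.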

We produced three main products from this research and analysis: a full survey report (linked below for 2006), a PowerPoint presentation that summarized the main findings, and a Spanish-language presentation that included not only the most significant findings but also additional questions that were sparked by the results. We made the report available on our website and also shared it electronically with allies and other targets (and, in a few select cases, directly with donors and policymakers). It was also used quite extensively by students, by organizations preparing projects and grants for work in the Latino community, and even by policymakers directly (we had some great victories when politicians cited the report in debates or other proceedings). The PowerPoint presentation was delivered, especially in the early years, to any group with an interest in learning more about our community; the main audiences were law enforcement, educators, faith groups, civic organizations, and other social service agencies. In 2005, 2006, and 2007, we generated some earned revenue from presentations to stakeholders in January, charging $5/person or $25/agency to those interested in learning more about our research and results.

Where I think our process was particularly different was in how we made sure to ‘close the loop’ with presentations to survey respondents, in multiple venues, throughout the month of November. We asked for their help in interpreting some of the data, paid respondents to participate in more in-depth focus groups, and, really, began here the process of thinking about changes for the next year’s research.

Tomorrow (in a MUCH shorter post), I’ll include some of our lessons learned and the links to the documents produced, as examples. Please let me know if you have questions!
