Measuring Success

by Katie Stokes-Guinan

Katie Stokes-Guinan is director of programs and quality control for Grail Family Services. She has worked closely with Arrupe Center students for four years.

Three Santa Clara University students, two young women and one young man, waited nervously at the entrance to my classroom, unsure of how to begin working with children in our literacy program. They did not go unnoticed by the first, second, and third grade students who looked up at the intruders with curiosity.

Before they arrived, these three Applied Sociology students studied sociological evaluation methods, familiarized themselves with the literacy program's curriculum, and were trained to implement a literacy skills evaluation. Now they were faced with using these skills to evaluate the impact of my program.

I welcomed the SCU students and invited one, a friendly-looking girl named Jessica, to follow me as I approached a normally very outgoing second grader named Juan. As we came near, Juan turned from staring wordlessly at the SCU students to examining the tops of his shoes, his hands in his pockets.

“Juan,” I said gently, crouching down to his eye level, “I’d like you to meet Jessica. She really likes stories, and she would love to read one with you. Is that okay?” Juan shyly nodded his head, and I sent him and Jessica off to conduct an assessment of Juan’s literacy skills.

An Innovative Collaboration
This exchange, awkward though it may have initially felt to the participants, was part of an important and innovative collaboration between Grail Family Services (GFS) and Santa Clara University.

Founded in 1995 and formerly known as San Jose Grail Development Corporation, GFS fosters learning and the empowerment of low-income families in San José’s multicultural neighborhoods through the delivery of programs that educate, develop leadership skills, and build a sense of community. GFS is recognized as an incubator of high-quality, community-focused, results-oriented programs that respond to residents’ needs.

One of our most successful programs is the Children's Best for Achievement (BEST) After School Literacy Program. Our connection to and collaboration with SCU enabled GFS to gain broader recognition for the program from local schools and the Alum Rock Union School District, all of which have called BEST an exemplary program that fosters learning, meets its stated objectives, and has a measurable impact on children's literacy skills. Our program is among the few with a structured curriculum, clear objectives, and a significant evaluation component, which SCU students helped us develop and refine, and that distinction has helped us gain attention and funding.

The GFS-SCU collaboration began in 2000, bringing together a small start-up non-profit that had yet to fully establish itself in the community and a well-established, well-respected university. The collaboration has provided GFS valuable assistance with several aspects of the BEST project, while SCU students have had hands-on opportunities to learn skills in data analysis, observation, and evaluation, as well as ample opportunity to work with children.

In fall 2000, GFS was busy laying the foundation for a community-wide early literacy initiative. Its flagship program was to be an eight-week after school program with a literacy focus for first, second, and third graders. I was hired to help develop and implement the program, as well as to evaluate the program's impact on the participants. Since GFS was a fairly new non-profit at the time, I had virtually no resources—no money, no staff, and no evaluation instruments—with which to conduct the evaluation. Regardless, GFS understood the need for program evaluation and was committed to finding a way to make it happen. After all, without a rigorous evaluation, how would we know if the program worked? How would we show prospective funders that our program was making a difference in the lives of the children it was serving?

In the midst of discussions about how to best evaluate the program, we received a phone call from Laura Nichols, assistant professor of sociology at SCU. Through a referral from the Arrupe Center, she contacted us to explore our interest in partnering with her Applied Sociology class during the winter quarter of 2001.

Nichols was searching for organizations that could provide her students, in the course of a 10-week quarter, with hands-on experience in learning how to do research that helps organizations work better. The timing couldn’t have been more perfect. It was agreed that GFS would create and implement an evaluation tool, then provide the data to the Applied Sociology students who would analyze the data, evaluate the results, and produce a report based on their findings.

Our first collaboration was a success on many levels. The SCU students got a sense of what it looks like to implement a pre- and post-evaluation instrument in a real organization. Both sides learned to expect the unexpected and be resourceful problem solvers when dealing with unanticipated data. And GFS learned that our first literacy skills evaluation tool was in need of an overhaul. The feedback and suggestions from the SCU students showed that our group testing method did not provide a wholly accurate assessment of each child’s skills, as some of the children copied their answers from their neighbors. Additionally, we realized the need to make tighter links between the program’s objectives and what the evaluation tool was measuring. We decided to re-write the evaluation tool and continue the collaboration with Nichols’ next Applied Sociology class the following year.

In the meantime, we continued to strengthen the program’s curriculum and search for funding to continue the program. Our observations of the children showed that the program was having a positive impact on their literacy skills. Parents and teachers alike were making positive comments about the progress they had seen in the children. We were encouraged, but still wanted objective, empirical data to demonstrate our results.

Improving the Tools
The next winter, we again tackled the challenge of evaluating the BEST program, this time armed with the wisdom and insights from our first experience. Incorporating suggestions made by the first group of SCU students, we rewrote the evaluation tool to allow us to test the children individually to get the most accurate picture of their skills. I met with the next group of SCU students before they came to my class to provide a crash course in working with children. In the evaluation process, each child was asked to read a short story and then answer some questions about the story. The SCU students timed each child's reading speed and noted how well they answered the questions to get a sense of their comprehension skills.

This time, the collaboration was stronger and the outcome was better. GFS got empirical data about our program, and the SCU students had first-hand experience in implementing an evaluation tool. But they also grappled with important sociological considerations such as how to ensure the validity of the measures, create a positive testing environment, and ensure uniformity in the scoring process.

During the third year we further improved our original evaluation tool, and also developed an additional tool to measure the children’s motivation for reading. SCU students piloted the tool with the children in the BEST program, revised the tool based on their pilot test, and again collected pre- and post-test data and analyzed the results. The final product was a written report clearly showing that children in the program increased their reading speed, reading and listening comprehension, and their interest in reading over the course of the eight-week program.

With reports written by outside evaluators demonstrating the positive impact of the program, we approached prospective funders with an eye toward expanding the program to additional sites. These reports, written by SCU students, proved instrumental in helping GFS secure funding to expand the BEST program to a second site, doubling the number of children served.

The Future Looks Bright
The BEST program has evolved tremendously over the past four years, thanks in large part to help from SCU students. With SCU students, GFS developed and implemented three separate evaluation tools to measure such indicators as the increase in children's literacy skills, the increase in children's motivation for reading, and the impact of the program on the children's parents.

SCU students working through the Arrupe Center have helped ensure a brighter future for hundreds of East San José elementary school children. And for many SCU students, it all started when they met with a shy second grader in my classroom.

For more information on GFS, see www.gfsfamilyservices.org.
