by Tom Corbett
Finally, the report identified several categories of research questions. At the broadest level was a set of issues reflecting the well-being of the low-income population at large and of subpopulations of interest. This fit very well with my general fascination with social indicators. Second, there was a set of policy and programmatic questions new to welfare evaluations. Unlike the era of a federally dominated cash-transfer system, we now had a proliferation of new approaches across the country. We needed studies of the wide variation in strategies and approaches that were emerging at the local level. We also needed to examine the changes in programs and approaches outside of the formal TANF system, since reform had stimulated a lot of innovative cross-systems thinking. It was no longer just about welfare in these new one-stop mega-agencies being developed.
Finally, we needed to fully explore and explicate a comprehensive list of research issues and match them up with appropriate methods. Up to this time, welfare evaluations had predominantly stressed random assignment experiments, the gold standard for doing impact evaluations. This had been true to the virtual exclusion of other methods.
Now, however, we were expanding the scope of interesting questions and thus had to carefully consider whether other analytical approaches might prove more appropriate to the nature and character of the underlying concern or contextual circumstances. Sometimes, for example, you are interested in assessing the effects of a bundle of changes, including altering the culture of program agencies. In such cases, traditional experimental methods are likely to prove inadequate or infeasible. You need to drift toward time-series modeling and comparison-group designs using carefully constructed comparison populations drawn from those not eligible for the program. In extreme cases, monitoring social indicators or administrative proxies (benchmarks) for the real outcomes of interest might be the best we can do. This is hard to do well but necessary until we come up with more rigorous methods at some future point.
The report also spent a lot of time examining national and sub-national data sets and exploring what would be required to develop an adequate data infrastructure to monitor the well-being of those populations we cared about. This became increasingly important as we devolved responsibility for their care to officials located closer to where help was delivered…the local community. One passage, in which reform was characterized as a “moving target” that is “still evolving,” tapped my concerns quite well.
…. states, having largely accomplished their caseload reduction goals, are now turning their attention to the provision of services to poor families, in general, and to women and families who are not receiving welfare. The provision of work supports, such as child care, as well as services meant to address other problems and barriers women experience in attempting to reach self-sufficiency, are widely discussed. Welfare reform is a continuing, dynamic process as states gradually confront new problems and face new challenges. The energy in this evolution is an indication of a system that is constantly trying to improve itself, which is clearly desirable, but it makes the problem of evaluation quite difficult.
All in all, I thought the tone of the report was responsive to the concerns I had raised throughout. I had pushed the notion that reform was spinning off in ways that were creating dramatically new forms and strategies for helping not only traditional welfare clients, but new populations not previously served by traditional cash assistance programs.
The videos I had shown the panel contained many interviews of local and state officials talking about this brave new world reformers wanted to create. Given this refreshing sense of freedom at the state and local level, one could see the rush to innovate with great imagination and commitment, spurred on through mechanisms such as WELPAN. It would have been a perfect world for me, if only I had created this new academic niche called institutional ethnography. More on that inspirational concept in just a bit. Ah well, we get too soon old and too late smart.
There is one thing you can count on with any welfare issue, even the boring or technical ones. Someone will be mad at you. The Manpower Demonstration Research Corporation (MDRC) was probably the premier evaluation firm studying welfare innovation. They were totally committed to the classic experimental design and viewed some of the report’s findings with alarm. Howard Rolston, the chief proponent of experiments within Health and Human Services, also was concerned about the tone of the committee’s work. He was sitting next to me at some meeting one day when a comment must have reminded him of the expert panel’s report. He suddenly gave me a whack on the arm. “Ouch,” I said, “what was that for?” He responded with something like, “That is for influencing the committee so much.” I protested that I was one small voice among many, but to no avail. Apparently, some folks had an exaggerated sense of my influence in the world.
Being a member of a NAS panel is not exactly a lucrative undertaking, considering that serving on one can get you whacked on the arm. Your only reward for providing this service is doing something for the public good, which most members are certainly willing to do. Still, they feed you well and they throw a few other perks into the pot, like an emergency travel number. At one meeting, a hurricane was moving along the Atlantic Coast, brushing alongside D.C. All day I watched the winds and rain increase and kept calling the emergency travel number they gave me. “No, Dr. Corbett, your flight back to Madison is okay,” said a cheerful voice on the other end of the line. I protested that I could see the trees bending at a ninety-degree angle outside my window, but the cheerful voice insisted that all was okay.
Of course, when I get to National Airport, it is a ghost town. The few people there looked at me as if I were a total lunatic, a common enough error. “There is a damn hurricane out there, what the hell are you doing here?” said the bemused person behind the ticket counter. So, now I figure that there is not a hotel room within fifty miles and call the emergency number I was given. “Dear caller, we are sorry to inform you that we are flooded out temporarily,” said a soothing voice on the other end of the line that sounded eerily like the cheerful voice which had recently assured me that all was okay. But there was yet another number at the end of this long message informing me of their woes. Fortunately, the voice that answered this call was still above water level and eventually did manage to get me a room for the night. I still had questions about this so-called perk, but all ended well.
Another night our deliberations were done for the day and a group of us left the NAS headquarters near the D.C. mall. We were heading off to an upscale restaurant located near Dupont Circle. Now, I mentioned earlier that Becca Maynard and I did squabble a bit, mostly in fun. Well, all in fun, really. The only person I know who really hated me was Governor Thompson. Becca and I knew the town best and told the group to follow us. Then Becca immediately turned one direction and I the other. “The restaurant is not that way,” I said. “Yes, it is!” she responded. So, we wound up standing in the middle of the street arguing while this group of smart people waited for us to decide which way was up. After she and I wagered twenty dollars on which of us knew the correct way, I managed to convince everyone to follow me. How I prevailed is a wondrous event since I never, ever win arguments with those of the female persuasion. Now that I think on it, I don’t believe she ever paid off the bet. Maybe I should give her a call?
There is a fourth act to this drama, one which both preceded and succeeded the expert panel’s deliberations. Even before national welfare reform, Bobbi Wolfe and I started looking at how the changing character of welfare might challenge the evaluation community. This last act began picking up speed in 1995, when devolution began to look like a runaway train and welfare was already being decentralized through an aggressive policy of granting states waivers from various federal rules and regulations. Just about any state with some crackpot idea could get permission to try it out.
I casually started tracking major studies and research initiatives out there, mostly to keep atop the lay of the land. I sensed that the research community was feeling some angst about its neat welfare world falling apart. After all, a lot of evaluation money was being thrown around. At some point, Bobbi and I concluded that we should bring leading evaluators together for a chat, not always the easiest thing to do since many were competing with one another for resources. It is a little like getting Verizon, Sprint, and AT&T to cooperate freely with one another. However, I knew by this time that people would come, if only to make sure that they missed nothing which might advantage the other guy.
So, in February of 1996, IRP and the National Center for Children in Poverty (NCCP) organized a rather large national conference in D.C. on the future of research and evaluation, in anticipation of the transfer of responsibility for welfare policies and programs to the states and even to local communities. The conference was financially supported by ASPE and the Foundation for Child Development (FCD), which Barbara Blum headed at the time. It was in putting together the conference that I first met Barbara, a relationship I would treasure until she retired from public life.
The mood at the conference, which brought together many of the premier academic and evaluation firm researchers in the country, was palpably pessimistic. Many felt that the glory days of welfare evaluation and research were about over as the locus of policy split into fifty or more separate fiefdoms. Of course, dire predictions don’t always come true, even when made by very smart people. For one thing, only cash assistance would be turned into a block grant later that year. The remainder of the social safety net went relatively untouched. And second, the philanthropic community would pour some one hundred million dollars into finding out what might happen if radical reform became reality while the federal government would spend millions more. Besides, the better talent within the research community always comes up with more research questions, even if they are not always on the mark.
This gathering generated a lot of useful conversations about emerging research and evaluation challenges. It may well have contributed to ASPE supporting the formation of the NAS panel a year or two later. More people began to think hard about the uncertainties in detailing the character of new program and policy interventions (particularly those that cut across traditional silos or program lines). Researchers became more cognizant of the need to identify additional outcomes of interest, now that these were exploding beyond the usual suspects (poverty reduction, work, and independence), and to be more careful in determining the correct unit of analysis for each study. On this last issue, one could see the unit of analysis moving from the individual to the family to multi-generational entities to the community, and who knows where else.
One spin-off from the dialogue initiated at the 1996 IRP gathering involved bringing together foundation officials for a presentation and discussion. Bobbi Wolfe and I put the event together with the help of Eric Wanner who headed the Russell Sage Foundation at the time. My memory is that Larry Aber, then at Columbia University, was involved as well. He had to be since I clearly recall meeting him at his office prior to the larger session. At that main session, we provided an overview of changes we saw occurring in the nation’s safety net along with our perspective on the challenges such changes posed for the evaluation community at large. It was a kind of big strategic planning meeting designed to get interested members of the philanthropic world on the right track. What made this gathering memorable for me was that it was held at the World Trade Center which, just several years in the future, would only be a memory.
Another outcome from the large 1996 gathering of the research community clan was the identification of a subgroup that really thought we needed to focus on what we called process or implementation analysis. Barbara Blum and I pulled that group back together the next year and collaborated with them through conference calls and workshops. All this effort and thought culminated in the publication of an edited work titled Policy into Action: Implementation Research and Welfare Reform, which finally came out in 2003. It was put out by the Urban Institute and was edited by Mary Clare Lennon, an associate of Barbara Blum located at Columbia University, and me. Other contributors included Demetra Nightingale and Pamela Holcomb (Urban Institute), Irene Lurie (SUNY-Albany), Lawrence Mead (NYU), Evelyn Brodkin (U. of Chicago), Kathryn Edin (Northwestern at the time), Catherine Born (U. of Maryland), Robert Goerge (Chapin Hall, U. of Chicago), Tom Kaplan (U. of Wisconsin), and Rebecca Maynard (U. of Pennsylvania). In addition, two practitioners added contributions: Joel Rabb from Ohio and Don Winstead, formerly of Florida but working at Health and Human Services in Washington at the time.
The volume was a leap forward in discussing what it would take to unravel the dynamic complexity of evolving welfare systems through advanced implementation evaluation methods. The genie was clearly out of the bottle by this time and would not be stuffed back in. The question was…what did this genie look like?
There was a straightforward rationale for focusing on implementation analysis. We knew there was a lot of talk about what states and localities were saying they wanted to do. This goes back to the old AFDC waiver days, when it seemed every governor was stepping before the cameras announcing that he or she would be ending welfare as we knew it. A key question always lurked in the background…did they do what they had said they were going to do? Was the character of their actual interventions anywhere close to the rhetorical posturing? Did the fidelity of their implementation approximate their initial vision to any degree? Our volume on assessing the implementation of new ideas was a step toward seeking methods for answering such questions.
As I said before, too many times perhaps, I wish I had developed this new area of study called institutional ethnography. I know that someone will now tell me that they have been teaching this for years at St. Mary’s of the Celestial Swamp College. You can even get a Masters in the field. But no one in the poverty arena, at least that I recall, called themselves one of these. And few of the top poverty researchers I knew spent much time in real agencies talking to real officials. Yes, Evelyn Brodkin, Irene Lurie, Larry Mead, Michael Wiseman, and Sandra Danziger (and colleagues) did spend time in the real world, and I am sure a few other academics did along the way. Most implementation investigations were being done to support an impact analysis and were not considered to be serious contributions. I am confident that those really interested in getting inside organizations to look beyond the surface did so in disguise, always worried that their colleagues would view such an activity with great suspicion. Why is this alleged academic not doing real academic work? What in God’s name are they doing out looking at real agencies and talking with real people?
I remember a chat with Jason DeParle once. Many of you will recall that Jason was THE New York Times reporter covering welfare and the social safety net in the 1990s. He was writing a book on several women affected by W-2 at the time. At some point in our conversation, I suggested he might someday want to write a book about how the welfare bureaucracies were changing and speculate on what the future might hold. I could tell the topic held no interest for him. People would want to read about other people, especially if suffering were involved, not about what they assumed would be bloodless bureaucracies.
On another occasion, the head of the British Bureau of Pensions and other things was visiting Wisconsin, both to see the Wisconsin Works program and to stop by the Institute. He would have been roughly the equivalent of the Secretary of Health and Human Services or the head of Social Security in the States. After his talk at an evening session at IRP, I asked what about the Wisconsin reforms had impressed him the most. He did not mention the falling caseloads or the new work requirements or the services to strengthen families or any of the other changes most observers might choose to cite. What he talked about was the physical layout of the new welfare offices in Wisconsin. He was taken by the fact that they did not look like traditional welfare offices at all. They literally invited people inside with their warm and professional ambiance. I found it fascinating that this is the message he took away from Wisconsin. Of course, perhaps he was being nice to a known liberal audience and did not wish to reveal all the ideas for slashing the dole back in Britain he was taking away.
At the same time, his observation uncovered a deeper insight. When cash welfare transitioned into a work program, perhaps it was permissible to spruce up the physical layout. You wanted people to come in the door and get their life on the so-called right path. Society sanctioned those programs designed to make people independent. On the other hand, it despised those programs that appeared to abet dependency. When you get program purposes right, a lot of other things fall into place, including the layout of your physical plant.
The bottom line is that I did spend a lot of time in agencies talking to people who design, manage, and work in these systems. I suspect it really is true that you should go out and look at stuff if you want to see anything, if you want to get beyond a surface understanding. Yogi Berra was smarter than we thought. I learned as much putzing around in the real world as I did sitting in the research presentations at IRP or in the many academic conferences I attended. You learn different things in each setting but they all have value.
I could see the foundations of welfare and social assistance shifting while most of my colleagues were still grappling with yesterday’s theoretical and management questions. I sensed that many of my peers were doing “rearview mirror” research, focusing on the past rather than on what might be coming up just around the next corner. My “rearview mirror” analogy was a term Barbara Blum loved and would often use herself. I could turn a phrase.
I tried to express what I was seeing in articles, book chapters, Focus pieces, and WELPAN reports throughout the late 1990s and the early years of this century. At the risk of boring readers to tears, I will summarize some of the things I saw as briefly as possible. There were several macro trends going on during this period, three of which, mentioned here, are found in the Lennon-Corbett work Policy into Action: