Updating Rules and Measures
Like the telegraph, the fountain pen and the horse-drawn buggy, IPEDS is a relic of a time gone by, in the view of many community college leaders. But rather than being put on some museum shelf amid praise for what it once did so well, IPEDS — the Integrated Postsecondary Education Data System — has stubbornly clung to its status as the chief means by which colleges are measured.
Three leading community college groups are aiming to change that.
The Association of Community College Trustees, the American Association of Community Colleges and the College Board have embarked on a yearlong effort to satisfy a goal shared by numerous college leaders — devising a new set of metrics that will accurately reflect the work under way on community college campuses.
“What we are trying to do is get a clear statement on how the value of community colleges will be determined,” said ACCT President J. Noah Brown. “If you want policy makers to stick with you through tough times, you really need more than just anecdotal evidence.”
The initiative represents an effort by community college leaders to seize control of the debate over accountability, which has been roiling since the 2006 release of the final report of the Bush administration’s Commission on the Future of Higher Education.
It also reflects a consensus shared by community college leaders — that current government metrics, focusing largely on graduation rates, are woefully inadequate in reflecting what is happening on community college campuses.
Gail Mellow, president of LaGuardia Community College, believes that IPEDS misses the mark when it comes to measuring community colleges.
While it slices and dices data by gender and ethnicity, IPEDS focuses primarily on graduation rates, measuring the proportion of first-time, full-time students who earn a degree within 150 percent of the normal time, three years for an associate degree and six years for a bachelor’s degree.
“IPEDS is laughably inadequate,” Mellow said. “By measuring first-time, full-time students, you are measuring 14 percent of the students who attend LaGuardia. The problem is that IPEDS never changed as the proportion of students attending community colleges has continued to climb.”
Diane Auer Jones, a former U.S. assistant secretary of postsecondary education and former community college professor, agrees.
“IPEDS are dreadfully inadequate for community colleges, and really other colleges, except for the elite institutions,” she said.
Community colleges contend that IPEDS is based on an antiquated notion: that higher education takes place primarily at residential schools, and that students are mostly fresh out of high school, between 18 and 24 years old.
That hardly describes the average community college student, who is apt to be older, going to school part-time, working and economically disadvantaged.
“Graduation rates are important, but the question is whether we can come to a larger version of the community college mission,” said Ronald Williams, vice president of the College Board. “We don’t want to avoid those things we are having trouble with, but we hope to capture the full value of community colleges.”
Organizers of the effort are mindful that policymakers often cite low graduation rates as evidence that community colleges are failing.
“I think we are all frustrated that the current data does not fairly measure us,” said George Boggs, president of the American Association of Community Colleges. “We don’t feel that IPEDS promotes a very good understanding for policymakers. If we come up with some measures, we’ll be better able to explain what we do.”
Efforts to devise alternative measures are not new, though the drive to develop a series of metrics to which community colleges would voluntarily subscribe is.
The Bush administration’s efforts to pursue a unit-record tracking system — which would track students’ educational progress across various institutions and into the workplace — were stymied by Congress. Private four-year institutions complained that such a system would represent an intrusion into student privacy.
Not everyone buys that argument, however.
“It was very easy for the elite institutions to fly the flag of privacy,” Jones said. “But I think the real reason is under the current model, that data is biased toward the privileged few.”
States Take the Lead
Meanwhile, several states have devised their own measures for community colleges. In North Carolina, for example, community colleges are measured on eight benchmarks, including passing rates of students in developmental classes, the academic performance of transfer students, passing rates on licensure and certification exams, student retention and progress and client satisfaction with customized training programs.
Texas, Florida and Indiana track completion rates for as long as six years and differentiate between full-time and part-time students.
Achieving the Dream: Community Colleges Count, an initiative spearheaded by the Lumina Foundation for Education, tracked community college students in six states as part of a study of alternative community college metrics.
The study included both part-time and full-time students. It characterized students as successful if they made significant progress toward a degree or transferred to a four-year school, even if they didn’t earn an associate degree.
Those changes altered the picture of success. In Florida, for example, graduation rates for full-time students jumped from 19 to 35 percent when the time frame was extended from three to six years.
The leaders of the current initiative hope to build on the work under way in states, Brown said. One of the first steps the organizers took was to commission a survey of what states across the country are doing in devising alternative measures of community colleges. Based on that data, the group plans to form several task forces to drill deeper into the data.
“We are not trying to reinvent the wheel,” Brown said.
The ailing economy is fueling the effort. As revenues decline, community college leaders are feeling particularly vulnerable to state budget-cutters.
“The economy really has added to the impetus that has existed for years,” Brown said. “We are behind the 8-ball. The economic situation certainly has accelerated the desire to get this done.”
Brown said he hopes that any alternative measure system will address the work community colleges do in workforce development. That’s a key part of the community college mission, and one that current measures simply do not capture, he said.
“We want a system under which we can demonstrate what we are doing in the area of workforce development,” he said. “I think it will enable policymakers to see us as a workforce development tool and an investment in the economy.”
Said Mellow: “The Education Department measures nothing about workforce development. It’s a critical part of our mission, and it’s invisible. There is zero data on the contribution to workforce development. That is a gaping hole.”
Supporters of alternative measures are sensitive to the potential criticism that they are cherry-picking measures that make community colleges look good while downplaying those that show where they come up short. No one is advocating ignoring graduation rates — merely augmenting them with additional data.
“One of the things we have to bear in mind is that any accountability system can’t avoid the things where we are having trouble,” said Williams. “What we want is a more sophisticated look at community colleges.”
Boggs believes that new metrics will have the added payoff of identifying the weaknesses of individual community colleges, allowing them to take steps to improve.
“We know we have to do a better job when it comes to graduation rates,” he said.
Boggs and others acknowledge the difficulty in devising a new system of metrics. Many colleges, especially smaller ones, lack the capacity to do the kind of research that is needed.
Moreover, there is disagreement about which measures are best.
Jones, for one, opposes the notion of standardized tests.
“I do worry about voluntary standards that involve standardized tests,” she said. “I think that it is easy to start gaming the system when tests are used as the primary assessment tool. Already I feel that SAT and GRE tests, and the related prep courses, are a problem in terms of the cost as well as creating an uneven playing field. Poor kids can’t afford the prep courses, not to mention that the ‘beat the system’ prep courses defeat the whole purpose of using tests to assess a student’s long-term learning potential.”
Mellow would like to see a measure that puts a dollar value on a community college education.
“We should have a price per student,” she said. “We spend one-third of what four-year colleges spend. So it would be interesting to see how many tax dollars are expended per student. The first thing should be economic value.”
Judith Eaton, president of the Council for Higher Education Accreditation and a former community college president, believes any new measurements must be aimed at helping students and parents distinguish among colleges.
“People want information by which they can make a judgment about the likelihood of success if they attend a certain institution,” she said. “People need to have confidence that they are likely to have a good result if they attend a particular college.”
A Template for Measuring Community Colleges?