Analyzing The Data
Community colleges and other public, two-year institutions provide a broad range of personal, professional, and academic training and development opportunities to a diverse array of communities. A significant portion of those opportunities take the form of academic programs that lead to a degree, certificate or other formal award. It is this portion that we examine in our annual “fastest growing” analysis.
We’ve been tracking growth among U.S. public, two-year colleges in this annual analysis since 2001. This year, we introduce one significant change. In previous years we restricted our attention to colleges included in the “public, two-year” sector of the federal Integrated Postsecondary Education Data System, known as IPEDS. However, as more community colleges add limited bachelor’s degree offerings, that restriction excluded a growing number of institutions that serve the core community college mission. In this year’s analysis, we expand our domain to include the 37 institutions that fall within the “public, four-year” sector but are considered associate colleges within the Carnegie Classification system because fewer than 10 percent of their annual awards are at the baccalaureate level or higher.
This change has a relatively modest impact on the four Top 50 lists included in this analysis: only five of the 200 listed institutions represent this added category and they are flagged to indicate their status. The change has a more notable impact on the trend data that we review in this introduction, after we consider in greater detail the sources, limitations, and interpretation of the analysis.
The Fall Enrollment Survey is part of the IPEDS data collection series administered by the National Center for Education Statistics, or NCES, a division of the U.S. Department of Education. The center contacts virtually every postsecondary institution in the U.S. and its territories and obtains a very high response rate. That response rate is encouraged by the penalty institutions face for non-compliance: they can lose their eligibility to enroll students who receive federal financial aid.
The IPEDS Enrollment Survey asks for fall semester enrollment and is collected during the spring semester. NCES administers the Web-based survey to more than 7,000 postsecondary institutions. Among the colleges included in this analysis, the response rate eventually comes very close to 100 percent. When we conduct our analysis, we use a preliminary release file, because not every institution that will eventually reply has yet done so. For the types of institutions we include, however, the vast majority have responded.
In order to enable reliable analysis of the data, NCES issues very specific definitions as to whom to include in the counts. Specifically, it asks institutions to report “all students enrolled in courses creditable toward a diploma, certificate, degree or other formal award.” In other words, it does not collect enrollments related to what is commonly called “non-credit” instruction. Since non-credit instruction is a large component of many public, two-year college missions, it is important to note that the survey does not reflect a large portion of these colleges’ students.
Our Selection Criteria
Following a practice that NCES uses in its own reporting, we restrict our attention to Title IV-eligible institutions, that is, those accredited by either a regional or specialized postsecondary accreditation agency. We also consider only those institutions located in the 50 states and the District of Columbia, excluding institutions in Puerto Rico and other “outlying areas,” such as American Samoa, Guam, and the U.S. Virgin Islands. Finally, we exclude U.S. service academies, which typically offer courses in dispersed locations across the globe.
As already noted, we expanded our selection criteria this year beyond the “public, two-year” sector to also include institutions that offer limited baccalaureate programs but still fall within the Carnegie Classification category “Associate Colleges.” In addition, we can analyze growth only for institutions that reported their enrollments at both the start (Fall 2005) and end (Fall 2006) points of our one-year time frame. These criteria yield a total of 1,181 colleges.
When we initially generate the data, cases emerge at the top of each list where the level of growth seems suspiciously high, such as an institution that doubles or triples its size in one year. We verify the data for these extreme cases by looking for corroborating sources: we contact individuals at the institutions, check their Web sites, or examine data maintained by state and system offices that also survey institutions about enrollment. If an institution informs us that the numbers it reported through IPEDS are incorrect or misleading, we remove it from the analysis entirely.
There are several reasons why the data we see in the IPEDS survey may not represent growth authentically. First, there is human error, where incorrect numbers are reported in one year or the other. This doesn’t happen very often. A slightly more common reason for inconsistent data is a change in reporting practices, such as when a new staff member takes over the reporting function and interprets the NCES definitions differently. But perhaps the most common source of inconsistent data is changes in enrollment policies and practices. One particular area that has become problematic in recent years is how to count dual enrollments, that is, enrollments among high school students taking college-level courses that count both toward their high school graduation requirements and, potentially, toward subsequent college matriculation.
One interesting example of such a change occurred throughout the state of Oklahoma, where a growing “Cooperative Alliances Agreement” has facilitated a large expansion in dual credit and other enrollments credited toward Oklahoma Technology Centers. Unfortunately, it is not possible to tell how much of the apparent growth is related to a different accounting practice and how much is related to an actual increase in participation promoted through this more integrated and comprehensive approach. Because of this, we excluded the individual Oklahoma technology centers from our lists.
The bottom line on data integrity is that we do our best to weed out obvious errors, but we do not uncover every case. And even when the data are technically correct, they will not necessarily agree with enrollment counts found on a specific college’s Web site or provided by a college official, who may use different definitions.
The Top 50 Colleges Lists
As in prior years, we provide four Top 50 growth lists, determined by the size of the institution. By doing so, we address two sides of the same problem in comparing growth: for large institutions, relatively large numerical changes represent small percentage increases, while for small institutions, small numerical changes yield relatively large percentage increases. By stratifying institutions into four categories based on Fall 2006 enrollment (less than 2,500; 2,500 to 4,999; 5,000 to 9,999; and 10,000 and over), we reduce the size confound in our comparisons.
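For readers who want to reproduce the stratification, the size bands can be expressed as a simple rule. This is an illustrative sketch only; the function name is our own and is not part of IPEDS or any official tool.

```python
def size_category(fall_2006_enrollment: int) -> str:
    """Assign an institution to one of the four size strata
    used in this analysis, based on its Fall 2006 enrollment."""
    if fall_2006_enrollment < 2500:
        return "less than 2,500"
    elif fall_2006_enrollment < 5000:
        return "2,500 to 4,999"
    elif fall_2006_enrollment < 10000:
        return "5,000 to 9,999"
    else:
        return "10,000 and over"

# Example: an institution enrolling 4,840 students falls in the second band.
print(size_category(4840))  # 2,500 to 4,999
```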
If we did not stratify institutions by size, the Top 200 percentage growth institutions would include 100 in the smallest size group, 49 from the second group, 38 from the third group, and only 13 from among the largest institutions. The Top 50 in overall percentage growth would include 40 from the first group and only 2 institutions from the largest group. Conversely, if we examined absolute numerical growth among all institutions, the Top 200 would include 78 from the 10,000-and-over enrollment group, 73 from the next largest group, 36 from the second-smallest size category, and only 13 institutions from the smallest size group. The Top 50 numerical increases would include 32 institutions from the largest size category and none of the smallest.
With our stratification in place, the resulting four lists are each ordered by percentage growth, which is the current year enrollment minus the prior year’s enrollment, divided by the prior year’s enrollment. As generally expected, the leading percentage increases are highest among the smallest size institutional category and lowest among the largest size institutional category.
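The growth measure just described can be written as a short calculation. The sketch below is illustrative only; the check uses the Wabash Valley College and Wayne County Community College District enrollments cited in this analysis.

```python
def percentage_growth(prior: int, current: int) -> float:
    """Percentage growth: current-year enrollment minus prior-year
    enrollment, divided by prior-year enrollment, as a percent."""
    return (current - prior) / prior * 100

# Wayne County Community College District, Fall 2005 to Fall 2006:
print(round(percentage_growth(14764, 19265), 1))  # 30.5
```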
There are two notable exceptions. The single largest percentage growth overall is found in the second size category (2,500 to 4,999 students), where the Wabash Valley College campus of Illinois Eastern Community Colleges grew by more than 50 percent, from 3,155 students in Fall 2005 to 4,840 in Fall 2006. The other outlier is found in our largest size category, where the fastest growth, 30.5 percent, is recorded by Wayne County Community College District, which increased from 14,764 to 19,265 students in one year.
Overall Enrollment Trends
In prior years, we summarized the overall enrollment growth of our target institutions by first placing them within the context of all accredited postsecondary institutions. We do so again this year but have had to make some changes to our summary table to accommodate the inclusion of four-year associate colleges in our target group.
The first table presented in this introduction shows this summary, with the public, two-year and four-year associate colleges shown as part of the target population. The rest of the table shows the remainder of four-year institutions (public baccalaureate and higher, as well as some specialized institutions and all private, nonprofit and private for-profit), the remaining two-year institutions (private, nonprofit and private, for-profit), as well as institutions that offer less than two-year awards (public, private, nonprofit and private, for-profit).
Our target category accounts for just less than one-fifth, or 18 percent, of all postsecondary institutions but enrolls more than one-third, or 36 percent, of all students. Comparatively, four-year institutions (excluding the associate colleges) account for more than twice as many institutions, or 40 percent of the total, but not quite twice as many enrollments, at 60 percent of the total. The remaining categories account for the other 42 percent of all institutions but only four percent of all enrollments.
Enrollment among the public, two-year and four-year associate colleges increased by just more than one percent between Fall 2005 and Fall 2006. However, the increase was much more evident among the small group of four-year associate colleges than among the larger group of public, two-year institutions. It may seem incongruous that this group experienced so much growth yet only five of its institutions show up in the four Top 50 lists. That is because the growth is more a product of an increase in the number of institutions in this category than of growth among existing institutions. Although not shown in the table, only 34 such institutions produced the Fall 2005 enrollments. The three institutions that joined the ranks between Fall 2005 and Fall 2006 account for the majority of this apparent enrollment growth.
The rate of growth for other four-year colleges was higher than among our target group, especially among the small for-profit component of this group. Interestingly, within the “less than two-year” group, which is dominated by for-profit institutions, the more notable growth occurred among the public and private, nonprofit components.
The second summary table relates to our stratification of institutions by size. It shows that the group containing the largest number of institutions, with 40 percent of the total, is the group of colleges with the smallest enrollments—the less than 2,500 students category. However, due to their small size they enroll a far smaller proportion of all students—some 8 percent. Conversely, the 16 percent of institutions that enroll 10,000 or more students account for almost one-half of all enrollments. The pie charts illustrate these inverse proportions in number of institutions compared to proportion of total enrollments by size category.
The overall pattern of growth across these categories shows that only one category, institutions enrolling between 2,500 and 4,999 students, experienced an overall decrease in enrollments. The greatest percentage increase in enrollments occurred among the next largest size category—5,000 to 9,999 students. It is important to note that these enrollment changes reflect both changes in institutional enrollments as well as the movement of institutions from one size category to another between the two years.
We close this year’s analysis with two new trend figures that illustrate the changing composition of public, two-year and four-year associate college institutions over the past ten years. The enrollment trend chart shows a marked increase between 1999 and 2002 in the enrollments accounted for by the largest-size institutions. The other three categories of institutions are relatively flat in their overall enrollments, with slight increases in recent years among the second-largest size category — 5,000 to 9,999 students — and very slight but steady decreases among the smallest size category.
The chart showing the number of institutions by size category shows a complementary trend: the number of the smallest institutions decreases notably through 2002, with the other categories remaining relatively flat. Within these less pronounced trends, the largest-size institutions exhibit a modest increase in number during the years of large enrollment increases, and the second-largest category — 5,000 to 9,999 students — shows modest increases in the number of institutions in more recent years.
Since we started publishing this analysis in 2001, overall enrollments among public, two-year and four-year associate colleges have increased by 7.4 percent, from roughly 6 million to 6.5 million students. The number of such institutions has increased by only 1.8 percent, from 1,160 to 1,181.
This increase in productivity is critical to meeting our nation’s educational needs, especially at a time when a variety of public-sector institutions compete for increasingly scarce public resources.
Victor M. H. Borden is associate vice president at Indiana University and associate professor of psychology at IUPUI.