Robert J. Morse, Director of Data Research,
U.S. News rmorse@usnews.com
Presented at 2005 SAIR Annual Meeting
Charleston, SC October 23, 2005
1983, 1985 and 1987: reputation only
Published annually starting in 1987 in U.S. News magazine and in a separate guidebook
Starting in 1988 the ranking methodology has been a mix of reputation and statistical data
Other rankings: Best Values, Best Publics, Debt load, Diversity, Undergrad business/engineering
Since 1997 America’s Best Colleges rankings online at www.usnews.com. More rankings/data online.
August 18, 2006: the next rankings, from the America's Best Colleges 2007 edition, will be released on usnews.com
To help prospective students and their parents make informed choices about:
an expensive investment (tuition, room & board, etc.); the total cost runs to $160,000 or more in some cases
a one-time career decision
Provide the public and prospective students with an understanding of latest trends in higher ed
Give practical advice on many aspects of attending, financing and applying to college
As one part of our ongoing reporting on educational issues
Based on accepted measures of academic quality
Provide comparable, easy-to-read and accessible statistical information on a large number of colleges
U.S. News’ ranking process is totally independent (unbiased) of information produced by a college or university through view books or other materials that students receive in the mail.
As one tool in college application process
U.S. News stresses need to consider: cost, location, course offerings, alumni network, state of a school’s facilities, placement success, visiting the school, faculty, input from counselors and parents, personal fit
Research and feedback suggest this is how students use the rankings
Feedback to U.S. News indicates readers value the directory, college comparison and search on web
Studies show that applicants, in general, do use rankings appropriately: UCLA Freshman Survey & Art & Science Group
Reasons noted as “very important” in influencing a student’s decision to attend this particular school (rankings placed 11th out of 18 reasons)
1. College has very good academic reputation 60.4%
2. College's graduates get good jobs 52.1%
3. A visit to the campus 44.1%
4. Wanted to go to a college this size 41.8%
5. Offered financial assistance 37.1%
6. The cost of attending this college 33.9%
7. Grads get into good grad/professional schools 32.4%
8. College has a good reputation for Social Activities 29.1%
9. Wanted to live near home 20.1%
10. Information from a Web site 15.9%
11. Rankings in national magazines 14.8%
12. Relatives wanted the school 9.9%
13. I was admitted through an early decision/action plan 9.3%
(2004 freshman norms based on 289,452 students at 440 4-yr schools)
Published by Sourcebooks Inc. @ $26.95
Sold in bookstores in late September 2005; this will be the third annual edition.
1,762 pages: 92 pages on How to Choose the Right College, 90 pages of lists/tables, a 128-page Index of Majors, and 1,443 pages of directory listings with full profiles of more than 1,400 schools.
The book does not contain the full Best Colleges ranking tables, but each school's rank appears in the book.
Book will continue to be published annually
Recently, the number 1 seller in its category.
Why is U.S. News trying to collect NSSE data?
NSSE data published only on usnews.com.
Schools are grouped by Carnegie group and listed alphabetically, with the national average shown for each question.
U.S. News is now the largest publisher of NSSE data.
5 years of data are on our site; data will be collected again in 2006 from 2005 NSSE survey participant schools. Approximately 30% of the 470 2004 NSSE survey schools sent in data to U.S. News.
• Collected exemplary program nominations for the 2006 edition, the fourth consecutive year.
• Presidents, Provosts and admissions deans at all 4-year schools were asked to make nominations. This was the first year that the admissions deans were surveyed.
• Not part of the overall ranking.
• An alphabetical list of the schools with the most nominations is published.
• It takes 7 votes to make the alphabetical listing.
These are the 8 program areas that we are surveying. Results are from the Best Colleges 2006 edition, in the free section of the site. More schools are shown online than in print.
First-year Experiences
Service Learning
Study Abroad
Senior Capstone or Culminating Academic Experiences
Writing in the Disciplines
Undergraduate Research/Creative Projects
Learning Communities
Internships, Cooperative Education, or Practica
Allegheny College (PA)
Alverno College (WI)
Brown University (RI)
Calvin College (MI)
Carleton College (MN)
College of Wooster (OH)
Duke University (NC)
Elon University (NC)
Grinnell College (IA)
Harvard University (MA)
Massachusetts Inst. of Technology
Portland State University (OR)*
Princeton University (NJ)
Reed College (OR)
Southern Illinois U.–Edwardsville *
Stanford University (CA)
St. John's College (MD)
Swarthmore College (PA)
Truman State University (MO)*
University of Chicago
Univ. of Missouri–Columbia *
Univ. of South Carolina–Columbia *
Worcester Polytechnic Inst. (MA)
*denotes a public school
Alverno College (WI)
Antioch College (OH)
Berea College (KY)
Butler University (IN)
Cal Poly–San Luis Obispo *
Drexel University (PA)
Elon University (NC)
Evergreen State College (WA)*
Georgia Institute of Technology *
Kalamazoo College (MI)
Kettering University (MI)
Northeastern University (MA)
Portland State University (OR)*
Rochester Inst. of Technology (NY)
University of Cincinnati *
Univ. of Maryland–College Park *
Virginia Tech *
Worcester Polytechnic Inst. (MA)
*denotes a public school
• Feedback and suggestions from AIR members have been very important to U.S. News: many analytical ideas, data-quality protocols and solutions to communication problems have come from the AIR/USNEWS Committee, HEDPC and AIR regional meetings.
• IR plays key role in filling out statistical surveys
• AIR members often have to explain the U.S. News rankings to campus higher-ups
• AIR members often do benchmarking against peer schools using U.S. News indicators and/or U.S. News data.
• U.S. News doesn’t expect criticism to end: this is just the right thing to do.
During the week of August 15, 2005, respondents to our main and financial aid statistical surveys were notified via email by U.S. News of the publicity schedule, etc. planned for the release of the America's Best Colleges (2006 edition) rankings on August 19, 2005.
Survey respondents were sent the same email notifications and table PDFs at the same time U.S. News sent them to college Public Relations offices. Respondents are no longer second-class citizens vs. PR.
The same procedures will be followed during the week of August 14, 2006.
The AIR/USNEWS Committee was behind this change.
During the week of August 15, 2005, U.S. News sent out free 1-year access to schools that filled out the U.S. News main statistical survey or financial aid survey.
That free access will last until August 2006.
The free access covers the full America's Best Colleges rankings, the web college directory, full college-search functions and any other college data on the site.
New data appeared August 19, 2005
1-yr. free policy will continue for upcoming rankings & data launch starting in Aug. 2006.
You received the 2006 Edition America's Best Colleges rankings tables in Excel, as published on usnews.com on August 19, 2005. An email request to U.S. News was needed to receive the Excel spreadsheets.
Provided free to AIR members to analyze U.S. News data for campus constituencies.
The full ranking data set is not available; only data from the ranking tables published online is available.
Announcements were made in the Electronic AIR and AIR Alert when the new data from the 2006 Best Colleges edition became available.
At this time we can only promise the 2005 rankings (2006 edition) in Excel, not the entire archive.
Unpublished rankings are available by request.
AIR members or other officials from schools can send a fax on school letterhead to 202-955-2263 to request unpublished rankings: either Best Colleges or Best Graduate Schools.
The fax request should say something like, "Please send our school the unpublished detailed rankings from the 2006 edition of America's Best Colleges."
Open Door Policy: meetings at U.S. News' DC offices with college administrators who have questions on the details of how the rankings work.
For the first time, U.S. News published the percentage of Pell Grant recipients per school, both in our newsstand guidebook and online.
Called “Economic Diversity.” In free section.
The Pell Grant data come from the Dept. of Education via Tom Mortenson of postsecondary.org.
We published 2003-2004 Pell Grant recipient data as a proportion of each school's total undergraduate enrollment, and the Pell Grant percentage for the top-ranked National Universities and Liberal Arts Colleges.
Plan to make this an annual feature
University endowments: U.S. News collected the dollar amount of each school's endowment at the end of fiscal 2004. This question used the information supplied by the school on the fiscal 2004 IPEDS Finance survey.
Published online for each school in the free section on the “At a Glance” page
It will not be used in the ranking model.
Plan to continue collecting endowments annually.
The IPEDS definition says to include independent foundations; we are not sure schools reported to U.S. News that way.
If a figure online is incorrect, it can be changed.
On November 17, 2005 the Carnegie Foundation for the Advancement of Teaching will publicly announce a "new" multi-dimensional classification system. An early version has been out for testing.
Carnegie says it will also revise the "basic" current Carnegie Classifications in December 2005. The current Carnegie Classifications have been the basis of the U.S. News America's Best Colleges ranking categories since at least 1987.
U.S. News will wait until after the December 2005 release of the revised "basic" current Carnegie Classifications to give our official reaction.
U.S. News is hopeful that we will once again be able to use the "revised basic" Classification to determine the Best Colleges categories for the rankings that we will publish in August 2006, the 2007 Edition of America's Best Colleges.
The Carnegie revisions will surely result in some schools changing America's Best Colleges categories and some schools being ranked for the first time.
The revised "basic" Carnegie Classifications will impact the America's Best Colleges rankings categories in calendar 2006.
A school's new U.S. News category, if there are changes, will be reflected in the America's Best Colleges Peer Assessment surveys sent out around 4/1/2006.
Co-op school proposal: co-op program students (those in co-op program jobs, not attending classes) who are enrolled as of the official IPEDS fall enrollment period would be excluded from enrollment when U.S. News calculates expenditures per student in the financial resources calculation.
Moody’s view
Why the proposal was made
U.S. News response
Co-op counts would be collected by U.S. News next year.
Cross-checking: our goal is to increase the amount of cross-checking that we are able to do against official sources. Data we would cross-check include the number of applications, the number of acceptances, and SAT/ACT scores. We used the IPEDS Peer Analysis System with the Fall 2005 Institutional Characteristics survey.
We currently do cross-check 6-year graduation rates with NCAA and IPEDS COOL, and faculty salaries with AAUP.
What proportion of the schools on each survey does the typical respondent rate on the U.S. News Peer Assessment Survey? How many votes does each school get?
U.S. News stresses that respondents should choose "don't know" if they have limited knowledge of a school. What proportion of respondents are either marking "don't know" or not rating a school?
What do the results show about the behavior of survey respondents?
U.S. News Peer Assessment survey conducted Spring 2004, results published in America’s Best Colleges 2005 Ed.
Total number of schools rated: 1,364
Total number of surveys mailed out: 4,950
Total number of surveys returned: 2,453
That equals an overall 60% response rate for surveys mailed out to Presidents, Provosts and Admissions Deans at each school.
Results from 2,453 surveys returned by respondents
Average percentage of schools rated per respondent: 56%
Average percentage of schools not-rated per respondent: 44%
Median percentage of schools rated per respondent: 55%
Standard deviation of the percentage of schools rated per respondent: 29%
68% of the respondents rated between 27% and 85% of the schools (i.e., within plus or minus 1 standard deviation of the mean)
Overall results are from all 1,364 schools rated
Each school was rated by an average of 56% of the respondents.
Median percentage of respondents rating a school: 55%
Standard Deviation: 17%
The lowest percentage of respondents that rated a specific school was 14%.
The highest percentage of respondents that rated a specific school was 95%.
Conclusions:
• Respondents are responsible in using the "don't know" or non-rating option when deciding whether or not to rate a school.
• Schools’ average peer assessment scores are based on ratings of a credible number of respondents.
• Results debunk the myth that a very large percentage of survey respondents are rating nearly all the schools on the survey.
• Debunks the myth that everyone rates Harvard, Yale and Princeton.
• Debunks the myth that the average peer assessment scores that U.S. News publishes for each school are based on a statistically "insignificant" number of raters.
• Debunks the myth that one respondent's rating can overly influence the peer assessment score that U.S. News publishes for each school.
• The distribution of the proportion of schools rated is reasonable given the size of the standard deviations.
• Supports the premise that respondents are rating schools where they have "some degree of knowledge." Thus, schools are being rated by respondents who know something about them.
Further things to look at:
• Reasons behind the small minority of respondents rating 90% or more of the schools. Anecdotally, we know some of the reasons why this happens, such as rating by committee. Does this account for the large % of schools being rated?
• Differences, if any, in the voting behavior of Presidents, Provosts and Admissions Deans, as well as regional differences and public vs. private.
A factor in the spread of the assessment movement in the U.S.: the National Survey of Student Engagement (NSSE) was started partly as a counterweight to U.S. News.
Part of growing accountability movement: colleges/grad schools have increasingly had to account for and/or explain actions undertaken, funds expended and how students and graduates perform and learn. Greater at Publics than Privates.
Influences admissions and other academic policy decisions made by schools. How real are claims?
Prospective applicants and enrolled students have become active consumers and have been given much more information to make independent judgments.
Has resulted in higher quality data: Common Data Set and IPEDS collecting more consumer data and posting it on IPEDS COOL.
Created a competitive environment in higher education that didn’t exist before. Competition makes everyone better and helps students.
Annual public benchmark for academic performance-moving up the rankings has become a goal of some college presidents/boards/deans.
Promotes quality in higher education.
This public benchmark helps schools that are not top-ranked move up and show that they have made "real measurable progress."
Created a "new class of elite schools" (Robert Samuelson). The "old elite" schools, the "Ivies," still exist.
Has led to the “commoditization” of college data in the U.S.
If the survey is not returned with the latest fall 2004 IPEDS enrollment included, the school receives Footnote 1: school declined to fill out the U.S. News survey.
Estimates aren't printed; they are published as N/A.
The following default and estimate protocols were used for the rankings published in August 2005.
If a school doesn't respond to the survey and has a CDS posted or other comparable data available on the school's site, we will use the data from the school's site and footnote this.
Z student selectivity = Z test score * (50%) + Z high school class standing * (40%) + Z acceptance rate * (10%)
Z grad and retention = Z avg. 6-yr. grad rate * (80%) + Z avg. freshman retention rate * (20%)
Z faculty resources = Z avg. faculty salaries * (35%) + Z faculty w/terminal degree * (15%) + Z % faculty full-time * (5%) + Z student-faculty ratio * (5%) + Z % classes < 20 * (30%) + Z % classes 50 or more * (10%)
Z academic reputation * (25%) + Z alumni giving * (5%) + Z financial resources * (10%) + Z student selectivity * (15%) + Z graduation and retention * (20%) + Z faculty resources * (20%) + Z grad rate performance * (5%) = each school's total weighted Z score
Z = each school's Z-score for that variable
Overall score for school X = school X's total weighted Z score / highest total weighted Z score of any school in X's category (scaled so the top school = 100), rounded to the nearest whole number
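To make the arithmetic concrete, here is a minimal Python sketch of the weighting and rescaling described above. The weights are the ones on this slide; the function names, variable names and data are illustrative only, not U.S. News code.

```python
# National Universities / Liberal Arts weights from the slide (sum to 1.0).
WEIGHTS = {
    "academic_reputation": 0.25,
    "grad_and_retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "grad_rate_performance": 0.05,
    "alumni_giving": 0.05,
}

def total_weighted_z(z_scores: dict) -> float:
    """Sum each indicator z-score times its weight."""
    return sum(WEIGHTS[k] * z for k, z in z_scores.items())

def overall_scores(schools: dict) -> dict:
    """Rescale so the top school in the category gets 100, then round."""
    totals = {name: total_weighted_z(z) for name, z in schools.items()}
    top = max(totals.values())
    return {name: round(100 * t / top) for name, t in totals.items()}
```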
Ranking model for the rankings published 8/19/2005.
Universe of ranked schools is ~1,360 regionally accredited 4-year colleges that enroll first-time, first-year, degree-seeking undergraduate students.
Classify colleges into categories using the 2000 Carnegie Classification of Institutions of Higher Education by the Carnegie Foundation.
Data gathered and analyzed on up to 15 indicators that measure academic quality.
Weights assigned to these 15 indicators.
Schools ranked against their peers, in their category based on their overall weighted scores.
Doctoral/Research Universities-Extensive and Doctoral/Research Universities-Intensive = National Universities
Master's Colleges and Universities I and II, combined and then divided into 4 U.S. geographic regions = Universities-Master's
Baccalaureate Colleges-Liberal Arts = Liberal Arts Colleges
Baccalaureate Colleges-General + Baccalaureate/Associate's Colleges, divided into 4 U.S. geographic regions = Comprehensive Colleges-Bachelor's
Ranking indicators:
Peer Assessment
Graduation and Retention
Faculty Resources
Student Selectivity
Financial Resources
Graduation Rate Performance
Alumni Giving Rate

Indicator weights, National Universities and Liberal Arts Colleges:
25% Peer Assessment
20% Graduation and Retention
20% Faculty Resources
15% Student Selectivity
10% Financial Resources
5% Graduation Rate Performance
5% Alumni Giving Rate

Indicator weights, Universities-Master's and Comprehensive Colleges-Bachelor's:
25% Peer Assessment
25% Graduation and Retention
20% Faculty Resources
15% Student Selectivity
10% Financial Resources
5% Alumni Giving Rate
Step one: create a z-score for each indicator in each U.S. News category using the data from the appropriate schools
Step two: percentage weights that U.S. News uses are applied to the z-scores
Weighted z-scores for each school are summed
Overall score for each school = each school's total weighted z-score / highest total weighted z-score for a school in each U.S. News category
The top school's overall score in each category = 100; other overall scores are sorted in descending order.
Formula to standardize data before weighting
Z-score = (X-U)/SD
X= each school's data point for that particular indicator in that category only
U= average of that indicator for that year of reported data in the school’s category
SD= standard deviation of that indicator for that year of reported data in the school’s category
Z-scores re-scaled for negative values: there has been no rescaling of negative z-scores on individual indicators since 2002.
Worked example:
90 = a school's top 10% high school class standing
60 = average top 10% among colleges reporting in that category
5 = standard deviation of top 10% high school class standing in that category
(90-60)/5 = 30/5 = 6; the z-score (standardized value) used for weighting is 6
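The standardization is simple enough to state directly in code; this sketch (illustrative, not U.S. News code) reproduces the worked example above.

```python
def z_score(x: float, mean: float, sd: float) -> float:
    """Standardize a school's value against its category mean and SD."""
    return (x - mean) / sd

# Worked example from this slide: 90 vs. category mean 60, SD 5 -> z = 6
assert z_score(90, 60, 5) == 6.0
```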
Why?
Reputation for excellence helps graduates with jobs and further education in graduate school
How?
Surveys of educators: the president, provost, and dean of admissions at each school. Respondents rate only the schools within their own U.S. News category, such as National Universities.
Rate the school's academic quality of undergraduate programs on a scale of 1 ("Marginal") to 5 ("Distinguished"), with a "don't know" option.
Average peer score = the sum of the 1-5 ratings given by respondents rating the school / the number of respondents rating that school in that category.
Surveys conducted in winter/spring prior to publication.
Estimates or defaults: None are used for this variable.
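A minimal sketch of the average peer score calculation, assuming "don't know" responses are recorded as missing (that representation is an assumption; the formula is from the slide above):

```python
def average_peer_score(ratings):
    """Average the 1-5 ratings for one school, ignoring "don't know" (None)."""
    valid = [r for r in ratings if r is not None]
    return sum(valid) / len(valid)

print(average_peer_score([5, 4, None, 3]))  # hypothetical raters -> 4.0
```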
Why?
Abilities and ambitions of students influence the academic climate of the school
How?
High school class standing: the percent of enrolled students who graduated in the top 10% or 25% of their high school class (40%)
Test scores (average SAT/ACT) (50%)
Acceptance rate (Accepted/Applied) (10%)
From current year USN survey
If not reported on the current year's USN survey, the value from last year's USN survey is used.
If not reported for either year, then the estimate = one standard deviation less than the category's mean.
If the percent submitting H.S. class standing is < 34% or null, an estimate is used. A footnote appears if the percent submitting is less than 50.
Changes were made in how we estimate HS class standing when a school has a small percent submitting high school class rank.
If < 34% submit, the estimate is 75% of the actual HS class standing submitted. For example: if HS submit = 25% and Top 10% = 50%, then 0.75 x 0.50 = 0.375, so 37.5% is the value of HS standing used in the ranking model.
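A sketch of that adjustment (the function name is hypothetical; the 34% threshold, 0.75 factor and example are from the slide):

```python
def hs_standing_for_model(top10_pct, pct_submitting):
    """Discount reported top-10% standing by 25% when <34% submit class rank."""
    return 0.75 * top10_pct if pct_submitting < 34 else top10_pct

print(hs_standing_for_model(50, 25))  # slide example -> 37.5
```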
Determine which score is used in admissions from the SAT/ACT policy questions on the CDS. If tests are required and either is accepted, use the % submitting SAT and ACT to determine the most frequently used test.
Use the average of Math & Verbal, if reported. If the average is not reported, estimate using the midpoint of the 25th and 75th percentile distribution. The same is done for the ACT composite score.
Scores are converted to the SAT or ACT national percentile distribution.
Example: 600 V = 79th percentile and 600 M = 76th percentile; (79 + 76)/2 = 155/2 = 77.5. That 77.5 percentile score is used in the model for the z-score calculation. The same is done for the ACT.
If test scores are not required and the percent submitting is < 50%, then the SAT/ACT percentile * 0.9 is used.
Schools are asked whether reported scores include: all internationals, all minorities, all student athletes, all legacies, all students admitted under special circumstances, and all summer enrollees. If Yes or N/A, the SAT/ACT percentile is used as is. If No/Null for any of them, then the SAT/ACT percentile * 0.9 is used.
If the SAT/ACT is not reported on this year's USN survey, last year's score is used.
If null for both years, then the estimate is one standard deviation less than the mean for the category.
Note: the 0.9 reduction roughly reduces SAT scores by 10% of the total combined SAT percentile distribution for ranking purposes. That means 80% x 0.9 = 72% for purposes of the ranking model. This does not change the score published in the ranking tables.
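A sketch of the percentile conversion and the 0.9 discount, assuming (an assumption, not stated on the slides) that the discount is applied once if either condition is triggered; names are illustrative:

```python
def sat_percentile_for_model(verbal_pctile, math_pctile, required,
                             pct_submitting, all_groups_included):
    """Average the section percentiles, then apply the 0.9 discount when
    tests are optional with <50% submitting or some groups are excluded.
    The discount affects the model only, not the published score."""
    pctile = (verbal_pctile + math_pctile) / 2  # slide: (79 + 76)/2 = 77.5
    if (not required and pct_submitting < 50) or not all_groups_included:
        pctile *= 0.9
    return pctile

print(sat_percentile_for_model(79, 76, True, 100, True))  # -> 77.5
print(sat_percentile_for_model(79, 76, False, 40, True))  # -> 69.75
```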
Acceptance rate = acceptances/applications; then take the inverse to create the rejection rate.
If not reported on current year USN survey, data is used from last year’s USN survey.
If both year’s data are unavailable, then use the estimate of 1 standard deviation less than the category’s mean
Why?
A measure of how satisfied students are with a school
To assess if a school is providing the courses and services students need for timely graduation
How?
Average Freshman retention rate (20% of category)
Average Six-year graduation rate (80% of category)
Average freshman retention rate: the average of retention rates for the classes entering in 2000 to 2003.
If fewer than 4 years are reported, a footnote indicates this.
Previous years' survey data are used for up to three years.
If no information is reported, the estimate is 46 + 0.54 * the average 6-year graduation rate.
If the average 6-year graduation rate is blank, then the estimate is one standard deviation less than the category's mean.
Average 6-yr. graduation rate: the average of six-year graduation rates for the cohorts of students entering from 1995 to 1998.
If fewer than 4 years' rates are used for the average, a footnote indicates this.
If a school is a non-responder and didn't return its U.S. News statistical survey, data from previous years are footnoted.
NCAA and IPEDS data are footnoted.
National Universities and Liberal Arts Colleges: the most recent 6-yr. rate is printed in the ranking tables; the 6-yr. average is on the web.
For NCAA Div. I, II, and III schools, we compared the NCAA-reported rate with what the school reported to USN for the 1995 and 1996 cohorts; for the 1997 entering cohort, IPEDS was used where there was no NCAA rate. If the rates were not an EXACT match, the NCAA/IPEDS rate was substituted for the USN rate for that year.
If no USN rate is available, then the NCAA/IPEDS rate is used.
If no NCAA/IPEDS rate, then previous years' USN data are used.
If there is no NCAA/IPEDS or USN rate but freshman retention is available, an estimate is made from freshman retention:
estimate = 1.3 * freshman retention - 44.2
If there is no data at all, then the estimate is one standard deviation less than the category's mean.
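One way to read the whole fallback chain is as a cascade. This sketch is an interpretation of the rules above (argument names are hypothetical, with None meaning "unavailable"), not U.S. News code:

```python
def grad_rate_for_model(usn, ncaa_ipeds, prior_usn, retention,
                        category_mean, category_sd):
    """Six-year graduation rate with the default/estimate cascade above."""
    if usn is not None and ncaa_ipeds is not None and usn != ncaa_ipeds:
        return ncaa_ipeds                # cross-check: not an EXACT match
    if usn is not None:
        return usn
    if ncaa_ipeds is not None:
        return ncaa_ipeds
    if prior_usn is not None:
        return prior_usn
    if retention is not None:
        return 1.3 * retention - 44.2    # retention-based estimate
    return category_mean - category_sd   # one SD below the category mean
```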
Why?
To measure the nature of student-faculty interaction
To assess the quality and commitment of a school’s faculty
How?
Class size: most small and fewest large classes (classes with fewer than 20 students, 30%; classes with 50 or more students, 10%)
Faculty salaries adjusted for cost of living (35%)
Proportion of faculty with top degree in field (15%)
Student-faculty ratio (5%)
Percent of faculty that is full-time (5%)
% < 20 class size: number of classes with fewer than 20 students / total number of classes.
% 50-or-more class size: number of classes with 50 or more students / total number of classes; then take the inverse to find the % that are not 50 or more.
Defaults: if not reported in the current year, we substitute the data reported to USN last year.
If no data for current or previous years, then estimate.
The estimate is one standard deviation less than the mean or half the mean if the standard deviation is greater than the mean.
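That default rule recurs throughout the methodology; stated as a helper (illustrative only), the half-mean guard plausibly keeps the estimate from going negative:

```python
def missing_data_estimate(mean, sd):
    """One SD below the category mean, or half the mean if SD > mean."""
    return mean / 2 if sd > mean else mean - sd
```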
Average faculty salary including fringe benefits: professor, associate and assistant ranks (not instructor rank), averaged over the two most recent years.
If only one year is available, those data are used.
Cross-checked against AAUP faculty salaries; if the average didn't match, the AAUP figure was used. If there is no USN figure, the AAUP figure is used.
If there is no AAUP figure, then the estimate of one standard deviation less than the category's mean is used, after the cost-of-living adjustment.
Salaries are first adjusted for cost of living using the Runzheimer International 300 City/Metro Area Index, family of 4, $60,000 income level. Where there is no metro area, the statewide average is used. This did not change in 2005 vs. 2004.
This is 35% of faculty resources.
Number of full-time faculty with a terminal degree / total number of full-time faculty.
If the data are not reported on the current year’s survey, then use last year’s survey data.
If there is no data from last year, then an estimate is used.
The estimate is one standard deviation less than the category’s mean.
Student-faculty ratio: self-reported by the school using the Common Data Set definition.
The value standardized is the category's maximum minus the school's student-faculty ratio.
If data for the current year is not available then use last year’s student-faculty ratio.
If there is no data, the estimate is 1 standard deviation less than that category’s mean.
Calculation: full-time faculty / (full-time faculty + 0.333 * part-time faculty), from the U.S. News survey.
If faculty data are not reported on current year survey, then use last year’s data.
If last year’s data is unavailable, use the estimate of 1 standard deviation less than the category’s mean.
If a school says 100% full-time, we double-check the likelihood of that claim. If the claim seems unlikely (a research or large university with no part-time faculty), then the estimate is used.
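A sketch of the percent full-time calculation (the one-third weighting for part-timers is from the slide; the name and numbers are illustrative):

```python
def pct_full_time_faculty(full_time, part_time):
    """Percent full-time, counting each part-time member as 0.333 FTE."""
    return 100 * full_time / (full_time + 0.333 * part_time)

print(round(pct_full_time_faculty(300, 90), 1))  # hypothetical -> 90.9
```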
2005 rankings (2006 edition), Private Colleges and Universities
IPEDS Finance, Total Expenses column:
Education expenses = ((Research + Public Service) * percent of full-time-equivalent enrollment that is undergraduate) + Instruction + Academic Support + Student Services + Institutional Support
Educational expenses per student = education expenses / total full-time-equivalent enrollment
2005 rankings (2006 edition), Public Colleges and Universities
IPEDS Finance, Total Expenses column:
Education expenses = ((Research + Public Service) * percent of full-time-equivalent enrollment that is undergraduate) + Instruction + Academic Support + Student Services + Institutional Support + Operations/Maintenance
Educational expenses per student = education expenses / total full-time-equivalent enrollment
IPEDS Finance reporting rules differ for publics and privates: O&M, scholarships, depreciation. The new GASB rules are now required.
Full-time-equivalent enrollment = (total full-time undergrads + total full-time post-baccalaureate) + 0.333 * (total part-time undergrads + total part-time post-baccalaureate)
Percent full-time-equivalent undergrads = full-time-equivalent undergrads / full-time-equivalent enrollment
Education expenses per student are averaged over the two most recent years fiscal 2003 and fiscal 2004.
After calculating each school’s education expenses per student (adjusted for research and public service) we applied a logarithmic transformation to the spending per full-time equivalent student. This was done for all schools.
That transformed value was then standardized before the 10% weight for financial resources was applied.
A small number of schools fall outside 2 SD of the mean on this parameter. Replacing the parameter with its log does not materially change the distance between values for the overwhelming majority of cases, those within 2 SD of the mean.
The transformation does reduce the values of the few outliers beyond 2 SD of the mean, reducing their impact in the ranking model. It doesn't change their place (they are still the leaders), but it does reduce the contribution of this one indicator to those schools' overall scores.
This corresponds to what U.S. News and many in higher education believe about the effect of spending on education quality, that beyond a certain level an increase in spending does not lead to a proportionate increase in quality.
If fiscal year 2004 data are not provided, only fiscal year 2003 data are used, or vice versa.
If data are missing for both years, a percentage of the category's mean is used, prior to taking the natural log.
For National Universities, 50% of the mean is used; for Liberal Arts Colleges, 66.7%; for Universities-Master's, 70%; for Comprehensive Colleges-Bachelor's, 75%.
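Putting the financial resources pieces together, a minimal sketch (argument names are illustrative; per the slides, Operations/Maintenance is included for publics only, and the natural log is taken after averaging the two fiscal years):

```python
import math

def education_expenses_per_fte(instruction, academic_support,
                               student_services, institutional_support,
                               research, public_service,
                               pct_fte_undergrad, fte_enrollment, om=0.0):
    """Education expenses per FTE student; pass om > 0 for publics only."""
    expenses = ((research + public_service) * pct_fte_undergrad
                + instruction + academic_support + student_services
                + institutional_support + om)
    return expenses / fte_enrollment

def standardization_input(per_fte_fy2003, per_fte_fy2004):
    """Average the two fiscal years, then log-transform before z-scoring."""
    return math.log((per_fte_fy2003 + per_fte_fy2004) / 2)
```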
Why?
An outcome measure of the school’s role in the academic success of students.
Does the school over- or under-perform with respect to graduation rates of students?
Used only in the National Universities and Liberal Arts Colleges categories, not in others.
How?
Measured as the difference between expected and actual six-year graduation rate.
Regression model used
Dependent variable: 6-year graduation rate
Independent variables: high school class standing, standardized test score, financial expenditures, and institutional control
The independent variables are taken for the corresponding cohort, and expenditures are taken for the first 4 years of the cohort.
If the 6-year grad rate is unavailable, the average 6-year grad rate of the previous 3 years is used.
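A sketch of the over/under-performance calculation using ordinary least squares; the slides name the variables but not the estimation details, so everything below is an illustration rather than the U.S. News model:

```python
import numpy as np

def grad_rate_performance(X, actual):
    """X: columns for class standing, test scores, expenditures, control.
    Returns actual minus predicted 6-year graduation rate for each school;
    positive values indicate over-performance."""
    A = np.column_stack([np.ones(len(X)), X])       # add an intercept
    coef, *_ = np.linalg.lstsq(A, actual, rcond=None)
    return actual - A @ coef
```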
Why?
A rough proxy for how satisfied graduates are with their alma mater.
How?
The average percentage of alumni with undergraduate degrees who contributed during the most recent two-year period; holders of graduate degrees only are excluded.
The alumni giving rate is calculated separately for the two most recent years and then averaged: undergraduate alumni donors / undergraduate alumni of record.
If only one year is reported on the USN survey, then the one year’s rate is used instead of the two year average.
If not reported, then use last year’s survey.
If the data are unavailable for both years, then Council for Aid to Education data are used.
If there is no USN data and no C.A.E. data, then use one standard deviation less than the category’s mean as an estimate.
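The giving-rate arithmetic, as a sketch (names illustrative; the year-by-year-then-average rule is from the slide above):

```python
def alumni_giving_rate(donors_y1, alumni_y1, donors_y2, alumni_y2):
    """Compute each year's rate separately, then average the two years."""
    return (100 * donors_y1 / alumni_y1 + 100 * donors_y2 / alumni_y2) / 2
```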