Emory University
Since discovering data
discrepancies last May, Emory has been investigating reports provided to
outside groups, including the federal government, that were evidently designed
to make the university appear more selective than it actually was.
The investigation focused on
three areas: whether incorrect data were
submitted; who was responsible; and how and why the practice began.
Findings showed that there had
been intentional misreporting over more than a decade and that leadership in
the Office of Admission and Institutional Research was aware of and
participated in the misreporting.
USNWR and other parties
involved in data collection for the
Common Data Set initiative were alerted to
the discrepancies in early June and chose to sit on the information until Emory
made public the embarrassing report.
Conducted under the supervision
of an independent third party, the investigation found that Emory had been using SAT/ACT
data for “admitted” students instead of “enrolled” students since at least
2000.
Stephen Spencer, Emory’s senior
vice president and general counsel, confirmed it was an “intentional decision”
to report incorrect data overstating test scores, presumably in an attempt to improve
rankings by making the school appear more selective.
In addition, the report found
that Emory “may have” excluded the scores of the bottom 10 percent of students
when reporting SAT/ACT scores, GPAs, and other similar information. Staff members responsible for these
discrepancies are no longer employed by Emory.
Here is how it worked. In 2009, Emory claimed that the SAT score range for
the 25th to 75th percentile of enrolled students was
1300-1480 (Math and Critical Reading), when in fact it was 1260-1440. The next year, Emory reported a middle
50 percent range of 1310-1500 for enrolled students, when it was actually 1270-1460.
GPA information was similarly
cooked. For 2009, Emory reported that 85
percent of enrolled students were in the top 10 percent of their class, when really
only 76 percent achieved that level of accomplishment. In 2010, Emory covered up a drop to 75
percent in the same reporting category by incorrectly listing the top decile as
87 percent.
In
an article on the USNWR website,
USNWR Editor and Chief Content Officer Brian Kelly said the situation
is under review and that the faulty data “would not have changed the school’s
ranking in the past two years (No. 20) and would likely have had a small to
negligible effect in the several years prior.”
This year’s annual
Best Colleges edition of
USNWR is scheduled to be released in
mid-September.
Emory’s scandal renews
questions about how
the CDS collects data, what training is provided to staff
submitting data, and who is overseeing the accuracy of data provided to the
public.
After both Claremont McKenna
College in California and Iona College in New York similarly admitted
falsifying CDS information for the purpose of gaming the rankings, officials
responsible for overseeing the CDS at USNWR,
Peterson’s, and the College Board
were contacted for fuller explanations of the process behind data collection
and reporting.
So far, none of the three
organizations has responded to questions concerning training and oversight of
the college-based staff responsible for supplying CDS data.
Outside of a “listserv” where anyone
may post questions and ask for “peer” assistance, there appears to be little
coordination or supervision of reporting procedures, despite what has evolved into a
multimillion-dollar business for the parties involved.
Monitoring of the listserv
over a six-month period suggests it is a little-used resource, but when
questions do arise, the variations in policy and procedure among different
colleges are nothing short of astonishing.
How much the scandal will hurt
Emory University remains to be seen. But
the larger question is how the various organizations providing data to
the general public, at significant cost, intend to clean up the mess.