About a year ago,
the ACT organization announced what appeared to be subtle
changes for 2015-16, mainly in the Writing section of the test.
In a carefully worded news release, ACT described changes starting
with the September 2015 test as “designed to improve readiness and help
students plan for the future in areas important to success after high
school.” In general, ACT proposed to
tweak the optional Writing Test in small—possibly
unnoticeable—ways.
And while the
1-to-36 scale would remain the same, ACT indicated students would also be
evaluated in new areas of writing competency, including ideas and analysis,
development and support, organization, and language use.
But it wasn’t until ACT recently announced changes in the reporting
documents provided to both students
and colleges
that the full story came into focus.
In draft versions of score reports planned for schools
and students,
it’s evident that ACT not only wants to provide information on student test performance
in five core sections of the test (including the optional Writing section), but
also wants to chop and dice it into a series of 11 sub- or “domain” scores,
including everything from “rhetorical skills” to “ideas and analysis,” all of
which are scored on a scale of 2 to 12.
In addition, ACT will generate two new hybrid scores in English
Language Arts (ELA) and Science, Technology, Engineering, and Mathematics
(STEM), based on various combinations of English, Reading and Writing or
Science and Math scores. The new report will
also provide terse one- or two-word written assessments of “understanding
complex texts.”
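As an aside on those hybrid scores: ACT hasn’t spelled out the formula in the report materials, but the arithmetic appears to be nothing more than a rounded average of the component scores (ELA from English, Reading, and Writing; STEM from Math and Science). A minimal sketch under that assumption, with invented sample scores:

```python
# Sketch of the hybrid-score arithmetic, assuming each hybrid score is
# simply a rounded average of its components on the same 1-36 scale.
def hybrid_score(*components: int) -> int:
    """Average the component scores and round to the nearest integer."""
    return round(sum(components) / len(components))

# Invented sample scores, for illustration only.
print(hybrid_score(28, 30, 26))  # ELA from English, Reading, Writing -> 28
print(hybrid_score(27, 29))      # STEM from Math and Science -> 28
```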
And for colleges correlating career readiness with retention
and completion, there’s a little bronze-to-gold level rating “certifying” skills
critical to future education and career success.
But of somewhat greater concern are assessments provided to
approximately 450 institutional participants in ACT
Research Services of “Overall GPA Chances of Success”
in various general categories of majors including education, business administration,
liberal arts, and engineering, as well as “Specific Course Chances of Success”
in broad areas such as freshman English, college algebra, history, chemistry,
psychology, etc.
Chances of success are expressed as the likelihood that a student
will receive a “B” or better, or a “C” or better, in these areas. And
they are nowhere to be found on the ACT report provided to students and families.
According to information provided by ACT, chances
of success are calculated using (a rough sketch of the idea follows the list):
- Student-reported information gathered as part of the registration process, including high school GPA and specific course grades earned
- Performance on the student’s ACT test
- Data provided by participating colleges/universities about the previous year’s enrolled students, including the college grade averages and course grades achieved by first-year students
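ACT does not disclose the statistical machinery behind these numbers. As a thought experiment only, the sketch below shows one plausible shape such a prediction could take: a logistic model fed the self-reported GPA and test score, with coefficients that would in reality be fit to each college’s prior-year data. The function name and every weight here are invented, not ACT’s actual model.

```python
import math

# Purely illustrative: ACT has not published its model. This assumes a
# logistic regression over the inputs listed above, with made-up weights
# standing in for coefficients that would be fit to each participating
# college's prior-year first-year grade data.
def chance_of_b_or_better(self_reported_gpa: float, act_composite: int) -> float:
    """Hypothetical P(first-year course grade of B or better)."""
    intercept, w_gpa, w_act = -7.0, 1.2, 0.15  # invented values
    z = intercept + w_gpa * self_reported_gpa + w_act * act_composite
    return 1.0 / (1.0 + math.exp(-z))

# Example: a student reporting a 3.4 GPA with a 26 composite score.
print(f"{chance_of_b_or_better(3.4, 26):.0%}")  # ~73%
```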
During registration, students are asked to voluntarily
report grades in core academic courses.
These grades are converted by ACT to an unweighted GPA on a 4.0 scale. None
of this data comes from the high school and there is no obvious mechanism for
verifying its accuracy, although students are clearly warned, “The information
you give may be verified by college personnel.”
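The conversion itself is trivial arithmetic. A minimal sketch, assuming the conventional letter-to-point table (ACT’s exact table, and its handling of plus/minus grades, are not specified in the report materials):

```python
# Sketch of an unweighted 4.0 conversion: every letter grade gets the
# same point value regardless of course difficulty (no honors/AP bonus).
# The grade table is the conventional one, assumed rather than ACT's.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def unweighted_gpa(reported_grades: list[str]) -> float:
    """Average self-reported core-course grades on a 4.0 scale."""
    total = sum(GRADE_POINTS[g] for g in reported_grades)
    return round(total / len(reported_grades), 2)

# Example: grades a student might type in during registration.
print(unweighted_gpa(["A", "A", "B", "B", "C"]))  # 3.2
```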
While these kinds of assessments aren’t exactly new, it is
troubling that ACT is marketing chances of success to unnamed colleges: a paid
service that estimates student potential by combining information
reported by the test-taker with scores and historical data provided
by the institution.
In other words, through the college score report forms, ACT effectively
gets more actively involved in the college admissions process by projecting for
admissions readers how likely it is that an applicant would succeed not only at
their institution but also in their chosen field of study.
With the score report in hand, an admissions officer who notes
that a particular applicant has indicated a desire to major in business on
the application will be able to see how likely ACT thinks it is that the
student will actually succeed in that major. And admissions could choose to admit, deny, or
recommend another major based on this speculation.
But students are left completely in the dark, as nothing
appears on documents they receive that would reveal what ACT is suggesting
about their chances of success at a specific institution. Chances of success do not appear on the ACT
Student Score Report because, according to ACT, “the college owns the
information” and “by sending their test scores to a college, students give the
college permission to use the data as they see fit.”
Yes, nearly all colleges already have enrollment management
software that does something similar.
But ACT should be in the business of writing and administering tests—not
getting in the middle of college admissions decisions.
It’s one thing for standardized tests to be important factors
in admissions, but now ACT proposes to pass judgment on chances of success in
ways that are patently unfair to individual students. And these kinds of
projections have no place in reports forwarded to colleges unless they are also
provided to the person who paid for the test—the test-taker.
ACT disagrees. In an email provided in response to a series
of questions concerning the new reports, ACT says,
“The recent enhancements we have made to the ACT Student
Score Report reveal more data than we’ve provided before. The chance of success belongs to the
institution, but if they provide permission, we will share that data on the ACT
High School Score Report and in turn, a counselor or administrator may share it
with the student.”
When registering for both the SAT and the ACT, students are
asked to provide a good deal of demographic information, including everything
from zip code to grades in individual classes.
Given that GPA information is factored into “chances of success” along with scores, it may make sense for some
students to simply leave this information blank and thereby avoid the
possibility of being labeled as a potential failure before even being given a
chance to succeed.
In the era of big data, these kinds of intrusions and
assessments are bound to become increasingly problematic. A partial solution is to be aware of what’s
going on and play the game accordingly.