There’s lots of news coming out of the College Board this week. As promised, scores for the first administration of the new or “redesigned” SAT are on track for release tomorrow.
But to get ahead of questions that are bound to crop up about how the new SAT compares to the old SAT and the ACT, the College Board needed to generate concordance tables to provide comparisons families and colleges could use to determine achievement relative to the existing tests.
And so today, the College Board announced the creation of 16 different concordance tables covering the span of relationships between the new SAT and the old SAT as well as the new SAT and the ACT (for ACTs given prior to September 2015).
To support the tables, the College Board also produced what it is calling the “SAT Score Converter” mobile apps, available for free in the iPhone and Android app stores. And if you don’t want to install an app for the purpose of comparing test scores, the tool is available online on the College Board website.
For those not familiar with the term, concordance is a way to compare scores from different assessments. Because the new SAT is a different test from the old SAT, it’s not possible to compare the two test scores perfectly. A score of 630 on the Critical Reading section of the old SAT will almost certainly not be equivalent to a score of 630 on the Evidence-Based Reading and Writing (EBRW) section of the new SAT because, according to the College Board, “each assessment tests a different domain of knowledge and skills.” Used properly, concordance tables can offer estimates of score correspondence from one test to another.
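In practice, a concordance table is just a list of matched score pairs, and scores that fall between published entries are estimated in between. The sketch below illustrates that idea with purely hypothetical score pairs — these are placeholder values for illustration, not actual College Board concordance figures.

```python
# Minimal sketch of a concordance lookup with linear interpolation.
# The (old score, new score) pairs below are HYPOTHETICAL placeholders,
# not real College Board values.
from bisect import bisect_left

CONCORDANCE = [
    (400, 450),
    (500, 550),
    (600, 640),
    (700, 730),
    (800, 800),
]

def concord(old_score):
    """Estimate a new-test score from an old-test score by interpolating
    between the nearest entries in the concordance table."""
    xs = [old for old, _ in CONCORDANCE]
    ys = [new for _, new in CONCORDANCE]
    if old_score <= xs[0]:       # clamp below the table's range
        return ys[0]
    if old_score >= xs[-1]:      # clamp above the table's range
        return ys[-1]
    i = bisect_left(xs, old_score)
    if xs[i] == old_score:       # exact table entry
        return ys[i]
    # Otherwise interpolate between entries i-1 and i.
    frac = (old_score - xs[i - 1]) / (xs[i] - xs[i - 1])
    return round(ys[i - 1] + frac * (ys[i] - ys[i - 1]))

print(concord(500))  # -> 550 with the placeholder table above
print(concord(550))  # -> 595, halfway between the 500 and 600 entries
```

The real tables (and the SAT Score Converter app) work from the published score pairs rather than a formula, which is why the College Board stresses using the official tool instead of eyeballing the difference.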
The new tables come with an astonishing amount of verbiage, a good deal of which is not terrifically useful for simply understanding or explaining why the scores are the way they are. The most obvious takeaway, however, is that the new scores look significantly higher than the old scores.
For visual learners, Jon Boeckenstedt, associate vice president for enrollment management at DePaul University, used Tableau to produce a really useful “visualization” of the relationships between the various tests. Each of the 16 tables has been converted into a separate “dashboard,” and by “hovering” over individual data points, you can see associated scores. But it’s the pattern of score separation that’s most interesting.
And the test results certainly “appear” higher. In 2015, the national means were 500 for Critical Reading, 510 for Math and 480 for Writing. On the new tests, those scores relate to 550 for EBRW and 540 for Math — 1090 versus 1010, or 80 points higher when the two sections are combined.
In fact, a quick and “uneducated” review of the new test scores could easily lead to the conclusion that a student did much better than anticipated on the SAT — similar to what happened with the new PSAT in the fall — as the difference between the old test and the new can approach 100 points when the math and verbal scores are combined.
Most admissions professionals feel confident that colleges will know the difference between the two tests and will most likely install software that will automatically convert scores during the reading process. They’ve been comparing SAT and ACT scores this way forever.
The bigger risk is that families and applicants could misinterpret scores from the new test. A family that accepts results from the new SAT at face value, and either doesn’t know about or doesn’t care to use concordance tools to convert the scores, may conclude that the test-taker has done much better than originally anticipated. And this could result in unrealistic college lists and some disappointment down the road.
“We just have to pound the drum that this is a totally different test. New SAT scores are no more directly comparable to old SAT scores than SAT scores are to ACT scores,” said Adam Ingersoll, of Compass Education Group. “Students—and colleges—will HAVE to use the conversion tool to make sense of it.”