The Misguided War on the SAT

His blog posts are not focused on Oregon State.

Oregon State is test optional. Even prior to 2020, test scores were only “considered” but not deemed an “important” or “very important” factor for admission decisions. In 2019, roughly half the admitted students scored below 1200 on the SAT.

His blog posts are aimed at admissions generally across the country in institutions of all types. He is remarking here on schools that admit, or historically admitted, a minority of applicants using scores as a means of gatekeeping.

1 Like

A great analogy!

We can close our eyes to the teen pregnancy problem by going “test optional”, but those babies are still gonna be coming due.

3 Likes

Reminder that cc is not a debate society. If you find yourself going back and forth with a poster, move it to PM.

A number of posts responding to a flagged and hidden post have been deleted.

I work with Black high schoolers, and it pains me to see them make the obvious mistake of not sending in their scores while predominantly white, Asian, and high-income students send theirs in.

You hit the nail on the head! The problem with SAT scores is that they are showing us something that many do not want to see.

9 Likes

I’m late to this discussion but what frustrates me is that the NYT article and the cited study use data only from a very small number of the most selective colleges in the US. I analyze college success data for a large public university. It is a well-regarded, moderately selective university and while SAT/ACT scores do help predict first year grades, they do a terrible job of predicting whether a student will earn a degree. “Terrible” = not statistically significant in a multivariate regression model.
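For anyone curious what that kind of check looks like in practice, here is a minimal sketch of the multivariate model I'm describing (the file and column names are made up for illustration, not our actual data):

```python
# Illustrative only: hypothetical file and column names, not the university's real dataset.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("admitted_cohort.csv")  # one row per admitted student (hypothetical)

# Logistic regression: does SAT still predict degree completion once
# HS GPA and basic demographic controls are in the model?
model = smf.logit(
    "graduated ~ sat + hs_gpa + C(sex) + C(first_gen) + C(race) + ses + C(cohort_year)",
    data=df,
).fit()

print(model.summary())
print(model.pvalues["sat"])  # "terrible" above means this is nowhere near 0.05
```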

A school like MIT has almost no variability in student HS GPA. How would you expect GPA to predict outcomes when everyone has a 4.0?
In our data we frequently see students with 1400+ SATs and 3.0 HS GPAs drop out of college with poor grades. We also see students with SAT scores of 1050 who earned 3.8+ GPAs in high school who absolutely thrive in college, earn degrees, and have successful careers.

SAT/ACT may identify a certain level of aptitude, but grades, even with some grade inflation, also measure a student’s ability to work hard. In fact, our numbers can only replicate the Chetty article when I narrow the sample to students with a UW HS GPA of 3.8 or higher and SAT scores north of 1300.

The state of higher education would be much improved if we collectively stopped fixating on the Ivy Leagues.

12 Likes

One thing to consider is that when US applicants apply to UK colleges, which are very stats-driven, more weight is given to AP scores than SAT scores. APs are another set of normed national tests, with a higher level of content but not a very high ceiling for distinguishing between applicants (a significant fraction of test takers score a 5).

However, many high schools don’t offer a wide range of APs, and at many others the majority of students fail even to pass the AP exams. That’s generally an indictment of high school quality and suggests that their students’ level of preparation for college is very poor (acknowledging that some privileged private schools think they can do better on their own rather than teaching to the test).

Comparing SAT and AP tests, it seems likely that the former is going to be better at identifying the “diamond in the rough” (scores are less likely to be negatively affected by poor teaching, since the material is easier and less prior knowledge is required), while the latter is going to be better at measuring readiness for college (in terms of working hard and absorbing college-level material).

3 Likes

In 2018, the median national family income was just above $60K.

I also went back and looked at the 2020 UC Berkeley CDS, the last year for which students submitted tests. That year, the 25th–75th percentile ACT range was 27-35. So going back to Jon B’s chart, I have marked the set of students from families earning below $60K who have ACT scores above 27 (and remember that 25% of Berkeley’s admitted class scored below 27).

The outlined part represents perhaps 20% of all students scoring above 27. There are two ways to look at this, both of which I consider valid. One is that it’s really unfortunate that only a fifth of potential applicants at or below median income are within the ACT score range that makes admission more likely. I don’t know anyone here who would disagree with that. (Note that median income is higher at $73K, but I am still using a $60K cutoff).

The other way to look at it is that there still are plenty of below-median income applicants applying to Berkeley and having a good chance of admission there. Berkeley would take a chance on the 27 ACT score from a less affluent school system, whereas it would expect much better performance from a student at say Palo Alto High School.

But in any case, the fraction of diamonds in the rough is certainly much more than the 1 in 500 number that Jon B threw out.

2 Likes

Most schools, unlike MIT, do not require calculus (if they have a math requirement at all). In most schools, a student in a major without math could graduate with the math requirement fulfilled by an AP Statistics level course, and science requirements fulfilled by “physics for poets” type courses. SAT math scores are likely to have little predictive power for such students.

3 Likes

Thanks for joining the discussion. Did you also take a look at the final GPA of students who did graduate, and was there any significant difference there by SAT/ACT score?

Also, if you have time, I would welcome your review of the report created by the University of California’s Standardized Testing Task Force. They found that standardized testing was useful for them, and would welcome your insight on what might make their conclusions different from yours. Individual college selectivity is one obvious difference, but I wonder if it goes beyond that.

1 Like

I would think that there would be a strong statistically significant correlation between first year grades (being high) and earning a degree.

At your public university, the kids who do worse in their first year are more likely to graduate?

Do UK universities have an overflow of applicants whose stats are pressed up against the ceilings? If not, then the ceilings are not too low for their purposes, even if they may be too low for a highly desired US university that admits purely on academic merit and achievement (on the other hand, lower ceilings may be convenient for many US universities that need a way to hide hooked students).

Thanks. Admittedly, I hadn’t looked at GPA upon graduation before. Typically we only report on first term and first year college GPA, retention, and graduation.

I did a quick analysis and found a significant difference in GPA at graduation by SAT score. Among students with a HS GPA of 3.8+, graduates who entered with a 1050 SAT earned an average college GPA of 3.1, while those with similar high school grades and a 1500+ SAT earned an average of 3.8. The SAT effect on GPA at graduation was also visible for graduates with lower HS GPAs, but less extreme (a 2.8 average at 1050 SAT vs 3.2 at 1500+). Is that difference meaningful among graduates? Probably for getting into graduate school, but probably less so for getting a job.
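If it helps, this is roughly the kind of cut I ran, sketched here with made-up file and column names:

```python
# Illustrative sketch only; file and column names are hypothetical.
import numpy as np
import pandas as pd

grads = pd.read_csv("graduates.csv")  # one row per student who earned a degree

# Band students the way the comparison above does: HS GPA 3.8+ vs below,
# and entering SAT roughly at the 1050 vs 1500+ ends of our range.
grads["hs_band"] = np.where(grads["hs_gpa"] >= 3.8, "HS GPA 3.8+", "HS GPA <3.8")
grads["sat_band"] = pd.cut(
    grads["sat"], bins=[400, 1100, 1500, 1600], labels=["<=1100", "1100-1500", "1500+"]
)

# Average GPA at graduation (and group size) for each combination
print(
    grads.groupby(["hs_band", "sat_band"], observed=True)["final_gpa"]
    .agg(["mean", "count"])
)
```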

Admittedly I skimmed the 220+ page document, but I agree with their findings on college GPA. However, there are two things that might explain the differences in their findings and mine.

The first is that they have a relatively large number of students with SAT scores below 1000. We have very few students below 1050. Looking at their graphs, much of the variability in their outcomes comes from those low-SAT students. Our data suggest that there may be a minimum SAT/ACT threshold for success in college; ours appears to be around the 1000 SAT mark.

The second is that their regression models appear to use only SAT and HS GPA. I also include sex, first-gen status, socioeconomic status, race, and year admitted. As their report notes, standardized scores vary quite a bit by demographics, and my model may be “canceling out” the test effect that they see.

6 Likes

You have hit upon one of the great paradoxes in our model. HS GPA + SAT are great predictors of first year GPA. Once you start college, the single best predictor of earning a degree is first year GPA. However, SAT does not predict graduation within our model. The transitive property doesn’t hold.

Keep in mind that even the best models for predicting graduation only account for about half the variability in student performance. We still aren’t great at measuring grit, good study habits, organization skills, and hard work. I have seen studies where students who score higher on a “hedonistic” survey metric are much less likely to graduate.

Some of the best predictors we have found for student graduation rates outside of typical metrics are level of homesickness, hours spent studying, and whether the student maintained a calendar of all their class assignment due dates and exams.

6 Likes

Adding to @hebegebe’s questions, @pilate your initial post seems to suggest that students with poorer SAT/ACT scores are as likely to earn degrees as those who score better, thus rendering SAT/ACT a “terrible” predictor of graduation rates. Did you also look at whether the former are more likely than the latter to earn degrees in majors different from their initial majors? Or more specifically, more likely to graduate as non-STEM majors when they enter college as STEM majors? This would address the issue of graduation rates not telling the whole story and paint a more accurate picture of SAT/ACT’s predictive value. There was discussion on this earlier but no studies appear to have been conducted.

Oxbridge has an overflow of US applicants with lots of 5s on APs and an overflow of UK students with sufficient qualifying grades in A-levels (typically offers are A*AA). They use additional tests and interviews to distinguish between these candidates. Other universities are able to make offers based purely on A-level results or AP scores. As you’ve noted before, the share of UK students able to attend Oxbridge is a higher proportion of the population than the share of US students able to attend top privates.

It’s easier to get a 5 on an AP than an A* in A-levels (A-levels are harder, with more content and longer exams, typically 2x3 hours). For example, Further Maths (self-selected like Calc BC) gives ~30% A* vs 43% 5s in Calc BC, and Maths (less selected) gives ~17% A* vs 22% 5s in Calc AB. If you take the overall scores on all exams, then roughly 11% of A-level students (10% before Covid, 12% last year) get at least 1 A* plus 2 As, compared to ~6% of US high school graduates becoming an AP Scholar with Distinction. But A-level students are a much more selective group (just under 40% of 18 year olds) than US high school graduates (around 70% of 18 year olds), so that’s a roughly comparable share of the population getting good grades overall (11% of ~40% is about 4.4% of the age cohort, vs 6% of ~70%, about 4.2%).

Many employers use a GPA cutoff to prioritize applicants coming out of college for job interviews. The most common cutoff GPA is 3.0.

I.e., the difference between a 2.9 and a 3.1 is likely to matter much more in job seeking than the difference between a 3.1 and a 3.3.

A similar pattern occurs in almost all research. Both SAT/ACT and grades are quite poor at predicting which students will graduate. There isn’t zero correlation, but the correlation is quite small. Whether a student graduates depends on things like whether the student can pay for college, whether the student has a family emergency, whether the student likes to spend his/her time drinking and partying instead of studying, … It’s often a very different question from whether the student is more likely to get a 3.x GPA or a 3.y GPA in their freshman year.

For example, the UC task force report found the following. SAT + SES only explained 4% of variance in 4-year graduation rate. It was still only 8% after adding in HS GPA. Colleges that look at 6-year graduation rate show a similar type of pattern, with little predictive ability.

Predictors of 4-year Graduation Rate in UC System
SAT + SES – Explains 4% of variance
HSGPA + SES – Explains 7% of variance
HSGPA + SAT + SES – Explains 8% of variance

The highest numbers I’ve seen are in the study at https://heri.ucla.edu/DARCU/CompletingCollege2011.pdf, which uses answers to a freshman survey to estimate graduation rate. And, more importantly, they predict the average graduation rate for the full college rather than the graduation rate for individual students. Specifically, they found:

HS GPA – Explains 14% of the variance in graduation rate
HS GPA + SAT – Explains 17% of the variance in graduation rate
HS GPA + SAT + Demographics – Explains 18% of the variance in graduation rate
Full Model + SAT – Explains 26.9% of the variance in graduation rate
Full Model without SAT – Explains 26.8% of the variance in graduation rate
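For anyone who wants to see how those “explains X% of variance” comparisons are typically produced, here is a rough sketch that fits nested models and compares R² (using a linear probability model as a crude stand-in, with made-up file and column names; the actual studies may use different model forms):

```python
# Rough illustration only: hypothetical file/column names, not the HERI or UC data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("freshman_survey.csv")  # one row per student; 'graduated' is 0/1

models = {
    "HS GPA only": "graduated ~ hs_gpa",
    "HS GPA + SAT": "graduated ~ hs_gpa + sat",
    "HS GPA + SAT + demographics": "graduated ~ hs_gpa + sat + C(race) + C(first_gen) + ses",
}

for name, formula in models.items():
    res = smf.ols(formula, data=df).fit()  # R^2 = share of variance explained
    print(f"{name}: R^2 = {res.rsquared:.3f}")
```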

Some of the survey factors most associated with higher graduation rates appear to be:

  • Being Asian, not being URM
  • Not being a native English speaker (less strong for international)
  • Being Jewish or Catholic
  • Did not attend a public charter HS
  • Parents are not divorced
  • Majoring in education or business, not majoring in health or engineering
  • Living in residence hall or fraternity/sorority; not living off campus
  • Not planning to transfer out
1 Like

It depends on the job. The “prestigious” jobs referenced in the original article will use a much higher GPA cutoff. My S’s (consulting) firm now uses a 3.9 GPA cutoff for interviews. It used to be 3.8 before COVID, but they wanted to filter more aggressively, because they have found that the best predictor of success amongst new hires is college GPA.

My takeaway from @pilate’s comment is that SAT may be a better predictor of success in higher-level courses, which require more depth of understanding (and raw intelligence?), not just hard work. And those 3.8 UW students with 1500+ SAT scores might be characterized as the “smart slackers” in high school (or ADHD kids :wink:). Whether hard work or raw intelligence is more valuable in the workplace after college will clearly depend on the job. Most employers, if pressed to prioritize one of the two, might opt for hard work.

3 Likes

My main takeaway is that having a high HS GPA is a better predictor of earning a degree.

Major is absolutely a factor in all of this. However, following students who change majors can be tricky. Do we count students who wait until the end of sophomore year to declare a major? If they changed majors, did they do so due to a change in interest, or were they failing out of their program?

I have looked at engineering students and they have a strong tendency to change majors after struggling in either Chemistry or Calculus. The vast majority switch to business and they do very well. They tend to have strong HS GPA and SAT scores, relatively low college grades, but strong graduation rates.