Stanford, Harvard, Dartmouth, Yale, Penn, Brown, CalTech, JHU, and UT-Austin to Require Standardized Testing for Admissions

The “country”, particularly the education establishment, has no interest in administering a nation-wide standardized test.

3 Likes

I certainly didn’t say that. I don’t understand why people who have no experience working in a HS and administering testing have such strong opinions on what it takes to run tests.

Like I’ve said before, if colleges want the tests, they should be part of the testing solution (and some are test sites). Seems simple. It also seems simple that if we want social emotional counselors to run tests they should get paid for that. Who is going to sign up to make that happen? Or add that testing coordinator headcount?

2 Likes

I have proctored tests but thanks for assuming.

I don’t have strong opinions about what it takes to administer tests. I have keen skills of observation, have lived in a bunch of different places, and have come to realize that some things are as complicated as the community wants them to be regardless of how complicated they actually are.

It’s like providing accessible options on election day. There’s always a big hue and cry about how hard and complicated it is, until you ask the actual poll workers, who point out the wheelchair ramp, the accessible gender-neutral restrooms, and the designated parking spaces for vans with lifts. Suddenly it’s not that hard.

2 Likes

Proctoring is a small percentage of the test administration process. Like I wrote above, counselors estimate 30+ hours of their time spent per Saturday test administration. Some of that will decrease with digital SAT administration, though, which is good.

1 Like

I give up.

Yes, without a dedicated counseling team which is getting paid overtime to plan, administer, clean up, verify IDs, kick out the kids who are texting each other the answers, etc., which will take 30+ hours PER STAFF MEMBER, no school can possibly offer the SAT.

The schools which use the Assistant Principal, the coach, the gym teachers, a random foreign language teacher, etc. to do those activities must be reported immediately.

It likely takes hundreds of hours to plan the senior prom, yet somehow nobody complains that prom is too time-consuming!

1 Like

I didn’t say no school can possibly offer an SAT, because plenty do…but it is one of the reasons why testing seats are down post-covid.

Again, no horse in this race. I don’t disagree…but generally in my neck of the woods it’s parents planning prom and similar events.

1 Like

Please take the back and forth to PM. Since debate is not permitted on CC, further posts will be deleted without comment. Thank you.

1 Like

I think we all recognize the excuses are just cover for why California doesn’t have many test sites. Either schools are opposed on principle, or they are terrified of what those tests would actually show with respect to its public education system, or likely both. In any event, many elites are bringing the tests back, and likely Harvard and Pton will follow. Those applying will figure out a way to do it if they can.

No doubt there are tens of thousands in China who can’t take it either. Life can be unfair. There are alternatives.

5 Likes

Who pays the proctors for tests that fall outside of the school day? I always assumed it was CB (or ACT). Is it the host school?

Correct me if I’m wrong, but doesn’t a student need a computer capable of loading Bluebook if they are to take the dSAT (or dPSAT) on their own device? Wouldn’t this be a barrier for low-SES populations? A personal Chromebook isn’t capable of downloading Bluebook; only school Chromebooks can (at least where I live).

My somewhat diverse public school (20% on free/reduced lunch, 80% of students go to 4-year colleges, and we’ve gotten multiple targeted mailings from Questbridge) has enough Chromebooks at school for everyone to test on school devices. Kids weren’t allowed to use their own devices, even if they owned one, for the PSAT in October. I’m assuming it will be the same for our school-day SAT later this month. But if these kids were to test on a Saturday at a location that required you to bring your own device, how would that work? My own student only has a Chromebook, not a computer. They would have to borrow my husband’s work computer if they needed a device to test on.

Thankfully, the threat of the SAT going digital was enough to encourage my junior to test on paper this fall so we don’t have to stress about all the changes to testing now. And our high school does do school-day SAT testing for all juniors. I’m just thinking out loud here since everyone is talking about testing site availability.

1 Like

One way private colleges that now require tests could help address the lack of test sites would be to offer free digital testing to any fly-in students who want it.
They could also accept ALEKS instead of the SAT if math is the section that matters most, as may be the case for STEM majors.

Most schools have the College Board apps for PSAT 8/9/10/11, so adding the digital SAT should not be hard once they do the hard yards initially. Our school system worked hard to add digital capability for the PSATs, and it should be easy from next year.

1 Like

The title of this thread is going to keep growing and growing…

3 Likes

Those are pretty striking results, especially in light of previous statements from around the landscape.

I’m trying to reconcile the results from the UCs (they saw little difference between their TO vs. pre-TO year grad rates/GPAs) with the UT Austin [edited] results. Since they are two successful and popular systems, I thought the results would be more similar. Has any article/writer attempted to cross this chasm? Potential factors include the 6% rule and differing amounts of grade inflation… I can’t think of much else here.

Or is it just how they are slicing and dicing the data: were the UT Austin [edited] cohorts already in existence pre-TO, but TO allowed them to label these cohorts and identify the difference in their FY GPA? Does the UCs’ test-blind policy not allow this same sort of labeling/identification?

1 Like

FYI, UTA is a different school (it refers to UT-Arlington). UT is the Texas flagship.

1 Like

There are several relevant factors. One is which criterion is compared between test-optional admits and test submitters. UT reported GPA for the first semester at the college. First-semester GPA is expected to show the largest difference. Cumulative 4-year GPA is expected to show a notably smaller difference than first-semester GPA, and graduation rate is expected to show the smallest difference of all. All selective colleges I am aware of reported no significant difference in cumulative GPA or graduation rate between test submitters and test-optional admits.

Another important factor is what criteria are being used to admit test-optional applicants. If the admission criterion is solely a high class rank (top x% rule) or a high GPA in isolation, then notable differences between test submitters and test-optional admits are expected. If the criteria instead include things like course rigor, adequate HS preparation for the prospective major, the harshness/leniency of HS grading, which courses had higher/lower grades and how relevant they are to the major, upward/downward trends, LORs, ECs/awards in fields relevant to the major, essays, … then much smaller differences are expected.

Another important factor is whether the test-optional and test-submitter groups were equally selective. For example, if you are admitting lower-scoring kids who are also generally weaker applicants, you’d expect different results than if you are admitting lower-scoring kids who are equally strong as test submitters in the other application criteria. UT reported a 300-point difference in test scores between test submitters and test-optional applicants, which is a far larger difference than I’ve ever heard of at a highly selective college. This makes me expect that there is something unique about UT’s admission criteria for test-optional applicants.

4 Likes

Apologies! Edited above (and thanks :slight_smile:). My spidey sense was tingling but I ignored it at my own peril.

2 Likes

That is the expected downside of a top X% auto-admit policy. Some students are from good schools and performed well; many only performed comparatively well at very poor schools and are leagues behind their new classmates. It doesn’t alter who gets in, just the academic profile of the class.
The same will occur for any high school or college using such admission criteria, including TJ in Fairfax.

4 Likes

These are useful, thank you. Has any article scrubbed these to see if anything (a more apples-to-apples comparison) was there? I’m sure UC data (especially UCLA/UCB first-year GPA info) exists in some form.

I’m surprised this isn’t scrubbed to death already, either in the papers or on Twitter. Sadly, I haven’t been able to look thoroughly.

So if it was an artifact of the 6% rule, do you feel these types of student profiles/results were already in place prior to TO?

Yes, the difference was that some of the admitted students were properly placed in remedial or basic classes when the school had their scores, rather than in the general classes.
75% of the class is admitted under the top 6% rule. Another 200 recruited athletes are admitted, and another 5% are international.
That is the vast majority of the class.

1 Like