**PSAT Discussion Thread 2015**

Everyone may find this info useful:

National Percentile A (Nationally Representative Sample Percentile): The first of the three main percentiles released (and the one prominently displayed at the top of every student’s score report), it is also the most dubious. The College Board defines this (on page 6 of its score explanation doc.) as a percentile “derived via a research study sample of U.S. students in the student’s grade (10th or 11th), weighted to represent all U.S. students in that grade, regardless of whether they typically take the PSAT/NMSQT.” Translation: this sample includes scores from students who don’t typically take the test, i.e., those who are least likely to be college bound and who are most likely to score in the bottom quartile of test-takers.

National Percentile B (PSAT/NMSQT User Percentile – National): This is essentially the same as the Nationally Representative Sample Percentile, except that it is weighted to represent “students who typically take the PSAT/NMSQT.” Translation: this sample drops the scores of the students unlikely to take the test. Unsurprisingly, a student’s User percentile was, on average, about two points lower than their Nationally Representative percentile. For some students, the difference was as high as 5 percentage points.

NMSC Selection Index Percentile: When scores were released last week, educators were provided with a third set of percentiles, the “Selection Index” percentile. This was still calculated using a research sample, but the sample was limited to 11th grade students. This percentile is based on a student’s NMSC Selection Index (48-228) rather than the student’s scaled score (320-1520). For example, if a student earned a selection index score of 205+ out of 228, they scored in the 99th percentile using the selection index percentiles. If you don’t know what your selection index percentile is, you can find it on page 11 of College Board’s Guide to Understanding PSAT scores.

From: https://www.■■■■■■■■■■■■■■/blog/2016/01/14/can-you-trust-your-psat-score/

IN OTHER WORDS, none of the percentiles reflects a ranking against only college-bound hopefuls (for the most part), as the percentile did in the past. The closest would likely be the percentiles sent to educators - not the ones in your report. And the percentiles you did receive could include students in 10th and 11th grade, plus students who didn’t take the test. The National Merit requirements are BOUND to change given how easy it will be to score in a high percentile.

@GMTplus7 Yeah, but it’s sort of an educated guess ;).

@GMTplus7 Sure - but National Merit is a separate organization that actually hands out money for these scores. I really doubt that they will be pleased to find that NOW they have to hand out way more money for inflated percentiles - since they are obviously inflated and bear no relation to old scores. Will schools give full and half scholarships to semi-finalists if there are now so many in each state? I just doubt it. It will be much less meaningful, IMO. Has to be - it is nowhere near the same honor as last year to qualify for National Merit if the percentiles do not reflect actual test takers only, like before. Which they do not. That is a fact, not a guess.

Or to think about it differently - if they keep the number of students winning at the same approximate percent of test takers - which is what it has always been based on - it cannot possibly extend down to the 97th percentile on the PSAT score guide you got, because that is not the same score as the 97th percentile of test takers alone. Not close. And that is what NM has been based on historically - the test takers and no one else.

This is a question for the National Merit org - not the test company.

@WayTutoring Applerouth jumped to conclusions a bit on those SI percentiles. He really doesn’t know where they came from.

Could someone please give me the link to the SI percentiles? Thanks in advance

Was just looking at the National Merit org guidelines - they are for the old tests currently as the guide was published in 2014. There is no information at all about how students will be selected from this Fall’s cohort.

Sorry, the above is incorrect - there is just nothing updated to reflect the new percentile reporting. But it does say that only 50,000 will be chosen - that will not reach down to the 97th percentile then.

At least - not the 97th percentile that you got at the top of your test report. It will be from the Selection Index Scores - which your schools received.

So according to the National Merit student guide for this year’s test takers - your score report does not tell you if your percentile is good enough. High school counselors probably have the info you need.

@WayTutoring Students received their SI scores on their score report. Applerouth’s numbers are educated guesses. No one knows the actual numbers except CB and NM.

I just re-read the 2015 understanding your score book again, and there is no way to discern the source of the SI percentiles in the table on page 11. There has been speculation that it is a research sample, but no way to know.

I personally think that they are accurate and based on real data, but I can’t comment on the shape of the distribution, so predictions built on it can’t be relied upon.

Everything gets funky when we start doing the concordance thing, and in the score booklet, it’s marked preliminary.

And so, we wait…

Let’s go back to the Applerouth article one more time. A GC calls and says my school average this year is 187, and last year it was 180. How do those compare? Well, take the 187, run it through the concordance tables, and a 180 pops out the other end. Your kids performed the same as last year! Just as you would expect - score one for the concordance tables.

But wait. A 180 on last year’s SI percentile table was 88th percentile. A 187 according to this year’s SI table is 91st percentile. Your school average increased by 3 percentile points!

A 3 point increase in a single year is incredible. Like in not credible. Like in makes you go hmmm, something smells funny, lemme see the data behind that table. Or maybe their kids just did a lot better this year. Maybe they teach common core better than everyone else.
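The discrepancy above boils down to a two-table lookup. A minimal sketch (the one-entry table stubs are hypothetical stand-ins for the real concordance and SI-percentile tables; the numbers are the ones from the post):

```python
# Illustration of the percentile inconsistency described above.
# These one-entry dicts are hypothetical stand-ins for the full tables.
concord_new_to_old = {187: 180}   # this year's SI -> last year's equivalent
pct_old = {180: 88}               # last year's SI percentile table
pct_new = {187: 91}               # this year's SI percentile table

old_equivalent = concord_new_to_old[187]       # 180: "same performance"
jump = pct_new[187] - pct_old[old_equivalent]  # yet the percentile moved
print(f"Concorded score: {old_equivalent}, percentile jump: {jump}")
```

If the concordance says the performance is identical, the two percentile tables should agree too - the 3-point jump is exactly what smells funny.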

Every year there’s a bullet on the SI chart stating the source of those numbers. This year that bullet is conspicuously absent. Where did those numbers come from?

@Dave_N, you are right about data reported by CB

from this link https://collegereadiness.collegeboard.org/psat-nmsqt-psat-10/scores/student-score-reports, I see

“Score Release Schedule
2015 PSAT/NMSQT online score reports were released Jan. 7. If you need an access code to view your scores, your counselor can provide it.
Paper reports will arrive at your school by Jan. 29.
If you take the PSAT 10, you’ll get your scores about two months after test day.”
This report is based on real data, since it was released to the media many days after the Oct. 2015 PSAT date.

@WayTutoring I know you are late to this thread, but what you posted is nothing this cohort of posters didn’t already know. Also, NMS is not going to give out double scholarships nor use stupid %'s calculated by College Board. They don’t need to. All they need is an Excel spreadsheet with 1.6 million students listed in rows, with their Selection Index score listed in column B. Sort by SI, highest to lowest. When they hit about row number 50,000 (up or down a few rows so that all students with the same SI are covered), they look at column B and say “Commended cutoff = that number”! Yeah. We have commended.
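That sort-and-cut step is simple enough to sketch. Everything below is hypothetical - the scores are randomly generated stand-ins, not real NMSC data, and NMSC’s actual process is not public:

```python
# Hypothetical sketch of the commended-cutoff procedure described above.
# Scores here are uniform random stand-ins, NOT real Selection Index data.
import random

random.seed(0)
# Pretend 1.6 million juniors, each with a Selection Index (48-228, even).
scores = [random.randint(24, 114) * 2 for _ in range(1_600_000)]

# Sort highest to lowest, then read off the SI at roughly row 50,000;
# everyone at or above that SI is commended.
scores.sort(reverse=True)
commended_cutoff = scores[50_000 - 1]
qualifiers = sum(1 for s in scores if s >= commended_cutoff)

print("Commended cutoff =", commended_cutoff)
print("Students at or above cutoff:", qualifiers)
```

The “up or down a few rows” part is why `qualifiers` can land slightly above 50,000 - ties at the cutoff SI are all included.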

Then for SF, they calculate out a % allocation of SF’s based on number of high school seniors and 16,000 (which is the number of students they need for SF). Likely very close to the numbers from last year which can be found here: http://www.nationalmerit.org/annual_report.pdf. Then they take that excel spreadsheet and break it up by state and sort until they get as close as they can to the number of students they are supposed to allocate to that state - and presto. They have state cutoffs.
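A sketch of that allocation arithmetic (the senior counts and score lists below are invented for illustration; real figures are in the NMSC annual report linked above):

```python
# Hypothetical sketch of the state-allocation step described above.
# Senior counts and scores are made up; NMSC's actual figures differ.
seniors = {"CA": 480_000, "TX": 350_000, "NY": 200_000}  # pretend totals
total_seniors = sum(seniors.values())
TOTAL_SF = 16_000  # approximate national semifinalist count

# Each state's share of the 16,000 is proportional to its senior class.
allocation = {st: round(TOTAL_SF * n / total_seniors)
              for st, n in seniors.items()}

# Given a state's test takers sorted by SI (highest first), the state
# cutoff is the SI at that state's allocated row.
def state_cutoff(sorted_scores, n_allocated):
    return sorted_scores[n_allocated - 1]

ca_scores = sorted([222, 221, 220, 219, 218, 217] * 2000, reverse=True)
print(allocation["CA"], state_cutoff(ca_scores, allocation["CA"]))  # 7456 219
```

Same spreadsheet, sorted per state instead of nationally - and presto, state cutoffs.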

All the college board math that we are trying to extrapolate info for is not important for NMF. Only for parents who are trying to figure out if their kid is going to make $$$ in this contest.

MODERATOR’S NOTE:
Please don’t feed the guys under the bridge; report the violators instead. Thanks.

@suzyQ7 Amen to that. People itt are giving way too much weight to concordance tables and rumors of how high scores are. And people don’t seem to understand how scaled scores and percentiles work if they think a 210 is going to be commended.

@gusmahler So 210 won’t be commended? Or will it be semi-finalist?

Since Applerouth published their analysis recently – https://www.■■■■■■■■■■■■■■/blog/2016/01/14/can-you-trust-your-psat-score/
& it has been commented about on this thread, I thought we might look back at their post on October 21 - after the PSAT of Oct. 14:
https://www.■■■■■■■■■■■■■■/blog/2015/10/21/new-psat-captures-the-hearts-and-memes-of-todays-teens/

Student comments were gathered:
“The Critical Reading passages were more challenging than on previous PSATs, reflecting the College Board’s increased commitment to enhanced textual complexity. Like the four SAT practice tests made available by the College Board, this PSAT boasted one particularly hard passage. In this case, the higher difficulty passage was a Frederick Douglass speech from 1852.”

“We heard complaints about the 60-minute length of the Reading test and the difficulty of pacing oneself on this now-longer reading section. Others found that the charts, tables, and graphs questions–the College Board’s foray into ACT Science-like content–were significantly harder on this PSAT than those on the provided sample test.”

Math

The two Math sections on the PSAT elicited different student reactions. The vast majority of students had concerns about section 3, the no-calculator Math section. The new test allows only 25 minutes for the 17 non-calculator math items. Several students wrote in online forums that they had more time to fill in their personal information than they had to take the no-calculator Math section. This section demanded much harder hand calculations than had appeared on the SAT or PSAT practice tests, making things particularly difficult for students who are accustomed to relying heavily on their calculators. Long division, multiplication and division of multiple decimals proved to be very time-intensive for many students. Additionally, some of the more complex algebra items surprised and challenged students.

Writing was said to be similar to practice materials & the past tests.

Also from the Applerouth website – “When scores are returned, many students will want to know if they are within the range of National Merit Scholarship consideration. This year, the National Merit cutoff scores will be based on a new Selection Index, which can be calculated by adding up the three section scores (each on a scale of 8-38) and multiplying the sum by 2. This gives a maximum possible Selection score of 228, a number relatively close to the familiar 240 point maximum scaled score of the old PSAT. National Merit cutoff scores will vary by state, and we expect the scores to range from the high 180s to the low 210s, based on the performance of students in each state.”
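The Selection Index arithmetic quoted there is easy to verify. A quick sketch (the section scores in the example are made up):

```python
def selection_index(reading, writing_language, math):
    """NMSC Selection Index for the 2015 PSAT: the sum of the three
    section scores (each on an 8-38 scale), multiplied by 2.
    The result runs from 48 to 228."""
    for score in (reading, writing_language, math):
        if not 8 <= score <= 38:
            raise ValueError("section scores run from 8 to 38")
    return 2 * (reading + writing_language + math)

# A student scoring 34 / 35 / 36 lands at 210 -- the low end of the
# high-scoring-state cutoff range Applerouth predicted above.
print(selection_index(34, 35, 36))  # 210
```

Note the doubling is why every possible Selection Index is an even number.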

Read more here - student reactions after the PSAT: https://www.■■■■■■■■■■■■■■/blog/2015/10/21/new-psat-captures-the-hearts-and-memes-of-todays-teens/

So Applerouth had predicted a lowering of the cutoff scores along the lines that Prep Scholar did - but now they seem to be changing their minds based on some data they are seeing. But the feedback they got after the test is also “real,” and maybe the curves reflect some of the challenges students had in reading and math (little curve in writing).