
Are we positively sure that GPA and test scores do not reflect intelligence in college?

OneWhoTalks Junior Member
At what point do we consider test scores not useful in gauging one’s true intelligence?

Replies to: Are we positively sure that GPA and test scores do not reflect intelligence in college?

  • ucbalumnus Senior Member
    At what point do we consider test scores not useful in gauging one’s true intelligence?

    Any proxy measure of intelligence (or of anything else) can have varying levels of accuracy, depending on the thing being measured and the design of the proxy measure. For example, assumed prerequisite knowledge may or may not have been taught equally to everyone. In addition, when the proxy measure is prepped for, it can measure prep activity as well as what it is intended to measure.

    In other words, the tests may measure intelligence, but not in isolation from some other things that affect test performance, and may not measure all aspects of intelligence.

    Regarding GPA, it may not be so much a proxy for intelligence as a proxy for how one is expected to do in future schooling.
  • RichInPitt Senior Member
    Intelligence is a bit of a nebulous concept, so there won’t be a definitive answer to the similarly fuzzy question of what is “useful in gauging one’s true intelligence.”

    It has certainly been shown that there is a significant correlation between SAT test scores and “IQ” scores, even though they measure different things and have different purposes. r = 0.7 to 0.8, IIRC (a rough simulation of what a correlation in that range looks like is sketched below this post).

    Correlation with GPA wouldn’t make as much sense because GPA isn’t a universally standardized measurement. I don’t think anyone would argue that a 3.2 means the same thing at every school. As a broad generalization, I would expect a 3.6 at Harvard, MIT, etc. to correspond to a higher “IQ” score than a 3.8 at our local community college.
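A minimal sketch of what a correlation in the r ≈ 0.7–0.8 range mentioned above looks like, using simulated rather than real data; the SAT and IQ means and SDs (1050/210 and 100/15) are assumed round numbers, not figures from this thread.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
target_r = 0.75   # assumed correlation in the range mentioned above

# Build correlated standard-normal pairs via a shared component.
z_iq = rng.standard_normal(n)
z_sat = target_r * z_iq + np.sqrt(1 - target_r**2) * rng.standard_normal(n)

iq = 100 + 15 * z_iq        # IQ-like scale (assumed mean 100, SD 15)
sat = 1050 + 210 * z_sat    # SAT-like scale (assumed mean 1050, SD 210)

r = np.corrcoef(sat, iq)[0, 1]
print(f"simulated Pearson correlation: r = {r:.2f}")   # ≈ 0.75
```

Even at r ≈ 0.75, individual simulated pairs scatter widely around the trend (roughly 44% of the variance is unexplained), which is consistent with the later posts about intelligent students who nonetheless score modestly.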
  • Riversider Member
    Mensa acceptance means the SAT is reliable, especially as a single test. GPA is school-specific, but a consistently good GPA at different institutions shows academic ability.
  • lookingforward Senior Member
    Why ask? You can be super intelligent and bomb std tests.

    And in admissions, it's not about IQ. What shows more is what you actually chose to do and accomplished.

    All std test scores truly show is the effort you put into them. There may or may not be a correlation between the sorts of kids who take the tests seriously and those who blow them off, or who just don't take to the format.
  • 1NJParent Senior Member
    Riversider wrote: »
    Mensa acceptance means the SAT is reliable, especially as a single test.
    Mensa doesn't accept the post-1995 SAT. Since then, the SAT has become more of a knowledge test than an aptitude test.

  • CompEngGirl123 Junior Member
    lookingforward said: "All std test scores truly show is the effort you put into them."

    Very false. I put A LOT of effort into preparing for the ACT (including taking an ACT prep class offered through my school and taking more timed practice tests than I could count, including tests from the Real ACT Practice Test book), yet the highest score I could get was a 26. Based on my 4.0 UW GPA and my course rigor, I originally thought that getting a 30+ would be a good goal for me, but I was pretty much capped at 26.

    All preparing for a standardized test does is help a student reach his or her potential. Preparing should improve a student's score, but the fact is that no matter how much students prepare for standardized tests, most are not going to get an ACT score of 36 or an SAT score of 1600. Being familiar with the test format and developing some test-taking strategies can help students answer more questions in a short period of time, but it can only do so much (some students naturally think and work faster than others).

    Also, a bit off topic, but I have to say I sometimes do wonder what would have happened if I had taken the SAT instead. On my first ACT practice test I scored a 21 with little preparation, but on the PSAT I scored the equivalent of an ACT 26, also with little preparation. However, even though I did much better on the PSAT, the "take off 1/4 point for each incorrectly answered SAT question" rule scared me out of taking the SAT. I still got into my first-choice college though, so everything ended up being okay in the end :)
  • ucbalumnus Senior Member
    1NJParent wrote: »
    Riversider wrote: »
    Mensa acceptance means the SAT is reliable, especially as a single test.
    Mensa doesn't accept the post-1995 SAT. Since then, the SAT has become more of a knowledge test than an aptitude test.

    The 1995 change was just a recentering of the scores.

    But even SATs before then were knowledge tests. The verbal section was mostly a vocabulary test. Questions were easy or hard based on whether you knew the words. The math section was about algebra and geometry.

    Now, intelligence mattered in how well you learned those things. But other inputs besides your intelligence mattered, such as the quality of math courses in your school and how much your English courses taught vocabulary directly or indirectly.

    Of course, intelligence in aspects other than those measured by the SAT did not matter for the SAT.
  • 1NJParent Senior Member
    ucbalumnus wrote: »
    But even SATs before then were knowledge tests.
    The march away from aptitude testing has been constant. Each revision made the test less of an aptitude test. The current version is barely distinguishable from the ACT in testing learned knowledge. Is it any wonder that test prep is so much more prevalent today than in the past?
  • privatebanker Senior Member
    It depends on the goal.

    It measures some base level of intelligence. However, it's not even the most valuable form of intelligence.

    For most leaders and trailblazers, incredible EQ is the most important trait. Of course, having high scores in both is ideal.

    But pure book smarts usually has you working for people with variations of the other combos of intelligence.

  • ucbalumnus Senior Member
    1NJParent wrote: »
    ucbalumnus wrote: »
    But even SATs before then were knowledge tests.
    The march away from aptitude testing has been constant. Each revision made the test less of an aptitude test. The current version is barely distinguishable from the ACT in testing learned knowledge. Is it any wonder that test prep is so much more prevalent today than in the past?

    https://www.erikthered.com/tutor/sat-act-history.html contains some SAT and ACT history, including sample questions from the 1926 SAT. Those questions still required knowledge typically learned in English and math courses in school.

    Test prep is more common because tests are more important to more people than before (more want to go to more competitive colleges).
  • 1NJParent Senior Member
    ucbalumnus wrote: »
    https://www.erikthered.com/tutor/sat-act-history.html contains some SAT and ACT history, including sample questions from the 1926 SAT. Those questions still required knowledge typically learned in English and math courses in school.

    Test prep is more common because tests are more important to more people than before (more want to go to more competitive colleges).
    Yes, some knowledge is always required regardless of the version, but the point is that each revision of the SAT made the test more dependent on knowledge than on aptitude.

    Test scores are supposedly less relevant today than in the past, so why do more students spend more time and resources prepping for tests? They wouldn't be doing it if test prep didn't help much. You hear stories these days on CC and elsewhere of students improving their SAT scores by hundreds of points. That never happened before.
  • ucbalumnus Senior Member
    1NJParent wrote: »
    Yes, some knowledge is always required regardless of the version, but the point is that each revision of the SAT made the test more dependent on knowledge than on aptitude.

    What examples can you give to support this claim?
    1NJParent wrote: »
    Test scores are supposedly less relevant today than in the past, so why do more students spend more time and resources prepping for tests?

    Test scores may be less important in a relative sense, but more important in an absolute sense, than before. Many colleges a generation ago were not that selective, even if they weighed test scores more than GPA. These same colleges, even though they may now weigh GPA more than test scores, have become selective enough that they require higher test scores than before.

    Test prep is also now much more accepted as part of the college admissions game.
  • momofsenior1 Senior Member
    When did your kids start getting their first standardized assessments? Mine had hers in 1st grade with the CogAT. Then it was state testing, repeat CogAT, entrance exams, etc. She's been consistently in the 98th-99th percentile on every assessment she's ever taken, up to and including the PSAT/ACT. I don't know if she's an outlier or if it's common for kids to be consistent across different tests. If that is more the norm, then I would say tests are predictive.
  • Lindagaf Senior Member
    One of the most intelligent students I have ever met applied test-optional at a very rigorous and well-known LAC. He graduated summa cum laude as a double major, has won a prestigious scholarship as a postgrad, placed second in a very difficult international competition, etc.

    As an experienced test prep tutor, I continue to be surprised by students who I know are highly intelligent and/or even gifted, but can’t get super high scores on the SAT or ACT. These tests are artificial and are not IQ tests. Of course, in general, more intelligent kids do better on these tests.

    I am no expert on intelligence, but I do not believe that a single SAT or ACT accurately reflects intelligence. Sure, if a student consistently scores highly, as did @momofsenior1's child, that probably establishes a student's intelligence.

    My own son was given extensive evaluative testing at ages 7, 10 and 13. He had the vocabulary of a 21-year-old (their words, not mine), but he has two learning disabilities. Several years later, he scored in the 99th percentile on both the ACT and SAT. His GPA hovered in the high 80s to low 90s, probably due to the combination of learning disabilities and laziness. He is not exceptionally intelligent. He just happens to have me, a test prep tutor, as a mother, and yep, I tutored him for the tests. He has two college-educated parents who read to him every day for years and years. I ask my kids if they are ready for sustenance rather than dinner, because it amuses me.

    I see hyper-intelligent kids overthink the SAT/ACT all the time. I recently worked with one kid who did not seem highly intelligent. His grades were mediocre. He had an uncanny ability to pick the right answers most of the time, because he simply looked at the trees instead of the forest. He didn't even try. It blew my mind. There are lazy kids who are really smart. There are dull kids who simply don't get flustered by tests and find the obvious answer.

    My opinion is that consistent high performance in school is probably the best indicator of intelligence. Standardized tests like the SAT/ACT do not indicate intelligence the majority of the time.

    @OneWhoTalks , I notice you have 24 threads and fewer replies. Don’t be one of those posters who is trying to rack up responses without ever responding. Are you researching?
  • ucbalumnus Senior Member
    IQ, as measured by IQ tests, is not everything.
    https://www.wired.com/story/forget-mensa-all-hail-the-low-iq/
  • roethlisburger Senior Member
    There are lots of semantic debates about how to define intelligence. What we do have is lots of evidence that standardized test score percentiles don't change much over time for individuals and that they do predict educational attainment. Your 7th-grade math ACT/SAT score is a very good predictor of whether you get a STEM PhD.

    https://my.vanderbilt.edu/smpy/files/2013/01/Article-NATURE-2016.pdf
  • roethlisburger Senior Member
    MWolf wrote: »
    My kid added 180 points to their PSAT scores in one year, and then scored 70 points higher than that on their SATs. I am absolutely certain that my kid did not gain 15 "IQ points", or whatever, between the fall of their sophomore year and the spring of their junior year.

    SAT scores aren't directly comparable to PSAT scores. A better way to compare them is percentiles. If your kid scored in the 84th percentile on the PSAT and the 98th percentile on the SAT, which is a one-standard-deviation jump, I'm willing to bet they are an extreme outlier in showing that much of a change (a rough percentile-to-score conversion is sketched below this post).
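A minimal sketch of that percentile-to-standard-deviation conversion, assuming scores are roughly normally distributed; the SAT-like mean and SD used below (1050 / 200) are assumed round numbers, not official figures from this thread.

```python
from scipy.stats import norm

mean, sd = 1050, 200   # assumed SAT-like mean and standard deviation

for pct in (0.50, 0.84, 0.98):
    z = norm.ppf(pct)        # percentile -> z-score (SDs above the mean)
    score = mean + sd * z    # z-score -> score on the assumed scale
    print(f"{pct * 100:.0f}th percentile ≈ z = {z:+.2f} ≈ score {score:.0f}")

# Approximate output:
#   50th percentile ≈ z = +0.00 ≈ score 1050
#   84th percentile ≈ z = +0.99 ≈ score 1249
#   98th percentile ≈ z = +2.05 ≈ score 1461
```

Under that assumption, moving from the 84th to the 98th percentile is indeed roughly a one-standard-deviation jump, which is the comparison the post above is making.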
  • MWolf Senior Member
    My kid added 180 points to their PSAT scores in one year, and then scored 70 points higher than that on their SATs. I am absolutely certain that my kid did not gain 15 "IQ points", or whatever, between the fall of their sophomore year and the spring of their junior year.

    SAT scores aren't directly comparable to PSAT scores. A better way to compare them is percentiles. If your kid scored in the 84th percentile on the PSAT and the 98th percentile on the SAT, which is a one-standard-deviation jump, I'm willing to bet they are an extreme outlier in showing that much of a change.

    As I wrote, she increased 180 points between her PSAT scores. So the scores are comparable.

    Percentiles are no good for comparison. The higher (and lower) you go, the fewer kids there are, and therefore the larger the increase in points required to move up each percentile. In the middle, between the 50th percentile and the 52nd, there is a score increase of only about 10 points, while at the tail, between the 98th percentile and the top, there is a score increase of about 100 points. So about 2% of 10th graders score around 940 on the PSAT, while you need to add up all the kids who scored 1320-1520 to reach 2%, since 1320 is in the 98th percentile.

    Yet according to your reasoning, an increase in one's PSAT score from 940 to 950 is comparable to an increase from 1320 to a perfect PSAT score. This is in direct contradiction to reality, since increases in score become more difficult as your score increases.

    But I don't even need my kid's example. Look at the difference in PSAT benchmarks between 10th and 11th graders. It's about 50 points. So I guess that kids, on average, increase their innate intelligence in one year.

    Furthermore, AOs at selective colleges discriminate between kids who get SATs of 1400 and those who get 1510, even though that is an increase of only 2 percentiles, and, by your logic, an SAT of 1450 and an SAT of 1510 are identical since they are both in the same percentile (a rough sketch of how compressed the score-to-percentile mapping gets near the top follows below).
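A minimal numerical sketch of the score-compression point above, again assuming a roughly normal distribution; the 10th-grade PSAT-like mean and SD (940 / 190) are assumed values chosen to roughly match the numbers cited in the post, not official College Board figures.

```python
from scipy.stats import norm

mean, sd = 940, 190   # assumed 10th-grade PSAT-like mean and standard deviation

def percentile(score: float) -> float:
    """Share of test takers at or below `score` under the normal model, in %."""
    return norm.cdf((score - mean) / sd) * 100

# The same 2-percentile step costs very different numbers of score points:
print(f"50th -> 52nd percentile: ~{(norm.ppf(0.52) - norm.ppf(0.50)) * sd:.0f} points")
print(f"96th -> 98th percentile: ~{(norm.ppf(0.98) - norm.ppf(0.96)) * sd:.0f} points")

# And equal score gains buy fewer and fewer percentile points near the top:
for score in (940, 950, 1320, 1420, 1520):
    print(f"score {score}: ~{percentile(score):.1f}th percentile")
```

Under these assumed numbers, a 10-point gain near the median moves a student about 2 percentiles, while the entire 1320-1520 range sits inside the top few percent, which is the compression the post describes.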