Problems and limitations with college career outcome reports: reading between the lines

I think engineering schools are particularly good at this.

RIT and WPI, for example, have very clear info (including the knowledge rate / number missing) for each degree too. They also include both the median AND average salaries.

e.g., specific ones from RIT

Link to all

WPI dashboard:
https://public.tableau.com/app/profile/wpi.institutional.research/viz/FirstDestinationSurveyDashboard/NEW-PublicFDSOutcomesReport
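
As a quick illustration of why it helps when a report shows both the median and the average, plus the knowledge rate (a minimal sketch with made-up salaries, not actual RIT or WPI figures): a few high outliers can pull the average well above the median, and both numbers describe only the graduates who responded.

```python
# Toy illustration (made-up salaries, not real survey data):
# a few high outliers pull the mean well above the median,
# and a low knowledge rate means both stats cover only respondents.
from statistics import mean, median

reported = [62_000, 65_000, 67_000, 70_000, 72_000, 75_000, 180_000]
class_size = 40                                   # hypothetical number of graduates
knowledge_rate = len(reported) / class_size

print(f"median: {median(reported):,.0f}")         # 70,000
print(f"mean:   {mean(reported):,.0f}")           # ~84,429
print(f"knowledge rate: {knowledge_rate:.0%}")    # 18%
```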

2 Likes

And to add: we reviewed this sort of information, but only as one small piece of the puzzle. We realize it can change a lot year to year based on the economy, that salaries are impacted by location, etc. And, as you point out, knowledge rate varies a lot.

1 Like

We didn’t really look at this information, but I can understand why some people do. With college being so expensive, many parents want to direct their kids to schools and/or majors that they think will lead to good-paying jobs. Of course, many schools don’t have very robust information, so you really need to take it all with a grain of salt.

4 Likes

This. As we used to say about some data analyses, garbage in, garbage out.

2 Likes

If only the labor markets behaved in the way that people think they should. If they did, there would be no lawyers- passed the bar, worked on a prestigious journal or two- doing document review for a temp agency (for a role which only requires a BA, not a JD, let alone bar admission). There would be no CS grads working as “Geniuses” at the Apple Store because they figured it was better to put Apple on their resume than “Staffed the help desk at an insurance company claim center.” There would be no accounting grads at H&R Block doing tax returns for people who earn $70K per year, and no finance grads doing their third post-grad internship in wealth management (each lasting 10 weeks, no benefits), desperate to get a full-time job- any job- that relates to finance.

But markets are tricky things. Supply, Demand, Externalities- all that jazz. And the occasional Black Swan.

5 Likes

I agree with you 100%. That is why it is important for kids to have a flexible mindset, a willingness to pivot when necessary and the ability to think outside the box.

3 Likes

Here is another good resource for exploring schools’ “ROI,” not as potentially skewed as some individual colleges’ self-reports:

And another, from the Georgetown Center on Education and the Workforce (CEW):

Note that both the FREOPP and CEW Georgetown analyses use earnings data from College Scorecard.

Also, calculating ROI (return on investment) depends on the investment as well as the return (earnings). Given the realities of college pricing, financial aid, and scholarships, the investment that a student makes can be very different from the investment made by some other student at the same college in the same major. Hence, even if the students end up with the same earnings after college, they can have very different ROIs.
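
A minimal sketch of that point, using purely hypothetical numbers (not drawn from either ranking): two students at the same college, with the same post-graduation earnings premium, can have very different ROIs simply because their net prices differ.

```python
# Hypothetical numbers only: same school, same earnings premium,
# but different net prices produce very different ROIs.
def simple_roi(earnings_gain_10yr: float, net_price_4yr: float) -> float:
    """Crude ROI: (gain - investment) / investment, ignoring discounting."""
    return (earnings_gain_10yr - net_price_4yr) / net_price_4yr

earnings_gain = 300_000   # assumed 10-year earnings premium, same for both students

full_pay = simple_roi(earnings_gain, net_price_4yr=320_000)
with_aid = simple_roi(earnings_gain, net_price_4yr=80_000)

print(f"full pay: {full_pay:.0%}")   # about -6%
print(f"with aid: {with_aid:.0%}")   # about 275%
```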

2 Likes

All good points. That’s even more reason why discerning “value” is so tricky. And why “outcome” is only a piece of the puzzle.

1 Like

In my opinion, measuring the numerator (outcome) is the more problematic part of ROI than measuring the denominator (cost). If two students have different earnings after college, there is no reason to assume the college name is the primary reason for the difference. More influential are differences between the students themselves – different majors, different career goals/paths, different backgrounds/connections/SES, different abilities, different motivations, etc. To get a reasonable estimate of the ROI from attending a particular college, you need a decent control for these types of individual student differences across colleges.

You can’t just assume that if you take the average MIT student and put them in a directional public, they will suddenly become the average student at that directional public, with a >50% chance of failing to graduate. Similarly, you can’t assume that if you take the average directional-public student and put them at MIT, they suddenly become the average MIT student who is likely to excel in classes, graduate on time, and receive many stellar job offers.

As an example, the latter ROI ranking shows the following colleges having the highest 20-year ROI. Major distribution and selectivity (differences in the average student at each college) clearly have an influence. For example, if a prospective ECE/EECS student is considering Olin but chooses to attend UC Berkeley instead, does he have a huge expected ROI loss, as the table suggests? Or would those differences in outcome averages be nearly eliminated with controls for things like majors/career goals and student ability? (A toy sketch after the list illustrates the selection effect.)

  1. MIT
  2. St. Louis Pharmacy
  3. Harvey Mudd
  4. Albany Pharmacy
  5. Caltech
  6. Olin Engineering
  7. Fuld Nursing
  8. MC Pharmacy
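
To make the selection-effect point concrete, here is a toy simulation (all parameters invented, not taken from College Scorecard or either ranking): if ability drives both admission to the selective school and later earnings, the raw earnings gap between schools ends up far larger than the true effect of attending either one.

```python
# Toy simulation of selection bias (all parameters invented).
# Ability drives both admission and earnings, so the raw gap between
# schools overstates the causal effect of attending the selective one.
import random

random.seed(0)
TRUE_SCHOOL_EFFECT = 5_000   # assumed causal earnings bump from the selective school

abilities = [random.gauss(0, 1) for _ in range(100_000)]

def earnings(ability: float, selective: bool) -> float:
    base = 60_000 + 20_000 * ability              # ability strongly drives earnings
    return base + (TRUE_SCHOOL_EFFECT if selective else 0)

selective   = [earnings(a, True)  for a in abilities if a > 1.0]   # top ~16% admitted
directional = [earnings(a, False) for a in abilities if a <= 1.0]

raw_gap = sum(selective) / len(selective) - sum(directional) / len(directional)
print(f"raw earnings gap:  ~{raw_gap:,.0f}")      # roughly $40,000
print(f"true school effect: {TRUE_SCHOOL_EFFECT:,}")
```

The raw gap mostly reflects who was admitted, not what the school did, which is the “control for individual student differences” issue described above.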
3 Likes

Excellent points and very well said! Which further demonstrates the problem with trying to use either of these measures, as they both have inherent flaws, beautifully described and exemplified above. So when students are making decisions about their colleges, looking at what they may or may not earn down the road, based on who may or may not have reported anything, and confounded further by how much they may or may not have spent on their education, it all becomes a rather muddy, somewhat useless piece of information. Just because someone reports this stuff doesn’t make it meaningful, and just because someone calls it “data” doesn’t mean it’s meaningful. That’s the point.

2 Likes

Jeff Selingo’s last webinar addressed using first destination surveys as one of the many tools in making a college decision: WEBINAR RECORDING: Hear from Jeff Selingo "How to Balance Fit, Cost, and Prestige when Choosing Your School as a HS Senior or HS Junior" - #8

5 Likes

I understand and respect the perspective both of those suggesting the use and relevance of outcomes reports and of those cautioning against their use and highlighting these reports’ shortcomings.

My question for those who find these reports enlightening is: shouldn’t you then be willing to pay up for schools that report greater successes, such as higher starting salaries, while acknowledging that not all schools produce the same results? Stated differently, if career outcome pages are to be used as a differentiator amongst schools, doesn’t that mean there are meaningful differences that should be considered, and possibly paid up for, regardless of individual anecdotal exceptions or narratives?

3 Likes

We definitely looked at the data. For engineering outcomes, things are fairly egalitarian for first job salaries and are more based on location differences than anything else. But, what was more interesting/helpful to us were the first employers and industry ties that are usually also included in those same reports. Even amongst highly regarded programs, there were big differences for my D’s intended major.

And yes, we did pay more for D to go where she thought was best (she had a full tuition scholarship she turned down). We have an only child, had a healthy 529, and had budgeted for full private school tuition. I realize the privilege of being able to take $ out of the equation.

6 Likes

Very helpful post!

Back in the day, we used to use a reference called Rugg’s Recommendations, which listed (I am paraphrasing) recommended schools (by selectivity) for majors. Fred Rugg updated his book yearly until 2018 (35th edition), adding new majors that hadn’t been previously listed. (I still have the 34th edition on my computer!) It also (again, back then) listed expected standardized test scores by major. Those were additional pieces of data that helped with the college search. We did not look at outcome data, in part b/c it wasn’t a driving factor in either kid’s decision. Starting salary wasn’t their driving issue, and while yes, where kids ended up working after college was interesting, not uncommonly the top place on many schools’ lists is working back at that very college!

Both my kids changed majors, so even if we had taken outcome data into consideration (even understanding its potential weaknesses), the info would have mattered little had they made their college choices based in part on job placement for their initial intended major. So outcome data based on career starts or companies employed would have ultimately been meaningless for them.

What’s even more interesting is that my kids went to different schools, started with different majors, both then pivoted to engineering (though different types), started in very different jobs in different cities, and, through several different opportunities and job changes over the years, ended up working for the same company and now even have the same job title!!

Point being, there are lots of variables to look at when choosing colleges to apply to. We were full pay (except for the NMS) for one kid, and the other kid had a full tuition scholarship (plus NMS). So the cost difference to us was notable, but they were both happy at their schools, have done well, and we are all thrilled with the outcome. And if they had looked at outcome reports, even with the potential limits of those self-reports (which are especially limiting for small schools and/or those with a low response rate), they might have looked at other schools that weren’t on the radar, or eliminated some that were. Bottom line, WRT outcome reports, they can add some information to the overall research, but caveat emptor.

eta:
Hopefully it’s clear- we paid very different amounts for each kid’s college tuition. Their first jobs were very different (each chose different types of jobs based on their specific area of engineering) but regardless of placement data, regardless of first job salaries (they were, IIRC, pretty similar starting salaries, adjusting for the fact that one was a few years later than the other), they ended up in the same place with the same job title (I don’t know their current salaries. Not my business).

1 Like

I feel like the problem with the discourse on this topic is that as this thread explains, there are many problems with taking summarized survey data like this at face value, and comparing one school to another may have “apples vs. oranges” problems in multiple ways. It’s just like evaluating research studies to see whether they prove what they claim to prove… or like the similar ongoing debate about the way colleges tout their med school acceptance rates.

But the way discussions work on CC, we’re kind of starting from scratch on each thread, and it derails the conversation about a particular OP’s situation if we try to take the necessary deep dive into what the survey reports may or may not mean. So, do we let a post referencing outcome studies stand, or do we get into a whole side debate about it? This thread represents a very helpful alternative - having the discussion, but taking it outside the confines of a specific thread. One hopes that we could all learn from the discussion, and that it would move the needle in terms of how we present the available data to individual posters looking for help with decisions… although I can’t say the needle seems to have moved a whole lot so far.

I actually think there are quite a few topics that would merit “reference threads” that we can refer to rather than reinventing the wheel on individual threads over and over again. And I think it would be great for more of us to create such threads on topics of interest - maybe there could be a particular forum or other structure for this, so that people can “blog” about various subjects and we could do a deep dive on their implications, and then refer to those discussions when advising individual students and parents.

At any rate… there’s a reason why the “lies, damned lies, and statistics” quotation never gets old. We all believe in the value of data, but data needs interpretation, and often there are significant gaps in the contextual information we need to be sure our interpretation is sound. And often that kind of analytical rigor belongs somewhere other than some kid’s thread about where they should attend and why. I feel like it’s worth having a “meta” discussion about how best to break these things down in a way that makes them useful to those seeking advice, and that enhances rather than derails their individual threads.

6 Likes

Another issue…in some cases, by the time this info is published, it isn’t really current anymore.

And things DO change. In four years, there will be careers that haven’t even been invented yet. And some hot careers won’t be hot anymore (ask those petroleum engineers from years ago…).

4 Likes

This is an excellent idea - to pin “reference posts” (and kudos for a super excellent overall post summarizing the problems with “data” presented at face value and with the same recommendation being repeated in multiple threads).

The other irony with the use of outcome reports is that on the one hand students may be encouraged to consider college choices based in part on the reported salaries their graduates earn in a particular field, but it is then also noted in other posts that students from different schools may end up making essentially the same salaries anyway. So why bother to look at salary reports when it may all be moot and salaries may be parallel regardless of where a person attended college? My kids are but one example, and others have been reported as well.

3 Likes

Carnegie has a new classification, Student Access and Earnings. I went through the schools on the list, and I am not impressed. I am very familiar with graduates from one particular niche school that is on the list. The grads I know borrowed a lot of money, including private loans - and they didn’t get well-paying jobs after graduation. However … there are some programs within this school whose grads get great jobs and do very well financially. Like any type of college ranking, you have to look beneath the surface to really understand.

5 Likes