Problems and limitations with College career outcome reports: reading between the lines

With a large enough data set, stratified appropriately (e.g. by major), some useful information can be discerned with less likelihood of being misled by outliers. Entire school medians and means are less useful, because then you have to consider the mix of majors.

1 Like

This isn’t about college scorecard. It’s about the limits of the data of many of these metrics. I have no opinion about any particular metric. But your comment that “some useful data can be discerned” is about right, in its limited format, in some cases. The data can be particularly limited and skewed if it comes from small schools. A large data set is necessary to minimize the influence of “the outliers”.

2 Likes

@kelsmom is this information that can be released to a third party like College Scorecard without direct consent from the student?

College Scorecard is from the federal government. For the earnings information, it used SSNs from students who received federal aid while in school. Earnings information for students with records in the National Student Loan Database System (NSLDS) is pulled directly from the IRS. Schools report major for students with loans, so that information is available in NSLDS. The government (was Department of Education) aggregates the information by school and publishes it. The limitations of the College Scorecard include the fact that only students who received federal aid are represented. Here is a document that explains the College Scorecard from a technical standpoint: https://collegescorecard.ed.gov/assets/InstitutionDataDocumentation.pdf.

5 Likes

Thanks. Very helpful. That's an important limitation.

It’s actually stated in the College Scorecard document I linked to. It’s a valid consideration.

2 Likes

Noting also that (AFAIK) schools are not using the collegescorecard data to populate their First Destination type reports, which I think is the focus of @jym626’s OP.

The First Destination Reports are populated by those who choose to complete their school’s survey in the months after graduation.

Small nit: DOE = Dept of Energy. ED = Dept of Education. Although ED may not exist much longer, pending Congressional action.

2 Likes

To clarify, any consent for IRS data on the FAFSA is one-time only, for that specific purpose.

1 Like

That’s a big limitation.

And how far after college graduation is this information accessible…

1 Like

But it would not have excluded Teach for America, Americorps or Peace Corps. It would not have excluded someone taking a glide year while preparing for medical school applications…and working in a lower paying job. Same with law school applicants.

It wouldn’t include kids like mine who had an undergrad engineering major and never worked or will work in that field.

IOW…the data is incomplete at best.

3 Likes

Over half of college students receive some type of federal financial aid. One does have to take a view on whether those who receive any type of federal financial aid (including those who took only student loans) have significantly different salary outcomes on average. CC'ers have debated this on a mind-numbing number of other threads.

College Scorecard has salary data for five years post-grad. But there is no limit to the timeframe they can pull that data, AFAIK. The bigger risk now is whether or not the Feds will continue to produce said data; it is required by law, but we shall see.

2 Likes

It’s interesting that College Scorecard pulls the data from students, and only from students, who took federal loans during college, but it doesn’t include students who get government grants in graduate school. Regardless, as was accurately pointed out above, the gist of this thread is to address the myriad limitations of college career outcome reports, and to warn applicants that this data can be so skewed and restricted that it is not only extremely limited but, in cases where the number reporting is very small, almost useless.

Some schools do have more information than others, in more searchable/accessible databases, and are more transparent.

Obviously I know the most at this point about Purdue, which is now up to a 90% completion rate for their “next steps” survey. Their career center actually does phone follow-ups with new grads who don’t complete the form online. It’s one of the highest participation rates in the country.

The database has information about salary, top employers, employment by state, graduate school by state, top industries, professional outcomes (pharma/vet), and internship data. They provide 5-year trends as well. It’s all searchable down to college and specific major. At least for my D’s major, the salary information is consistent with what is published by her national professional association.

The “industry” information was what my D focused on when trying to create her list. For example, UT and UDel are highly ranked in chem e, but both schools had strong industry ties outside her areas of interest. Again, for a kid who is pretty sure they know what they want to study, that kind of information can be helpful.

For me, the career outcome reports can just be one more piece of the puzzle. Heck, my D even looked up niche rankings when trying to figure out her list.

2 Likes

That’s nice to hear that Purdue does that. But do most colleges do that, and do they report what percentage of their graduates responded? That’s the point of the original post: one has to look at that piece of the data to see if there is really enough reporting to provide useful information. The article I linked points out that the “knowledge rate” can be well below 50%, which makes the information provided limited in its utility.

“According to the organization’s most recent annual report, which reflects data collected from the class of 2023, the national knowledge rate for bachelor’s degree students was only 56%. That means that the national career success average of 85% reported by the class of 2023 represents just over half of all graduates.

“That means that when it comes to the career success outcomes touted by any given college or university – even those that make annual “best of” rankings – it’s safe to assume that the numbers reflect only 56% of the institution’s students. Say a college reports a 90% career success rate of its 1,000 graduates, then we would expect 900 of those students to be employed or pursuing a further degree. But if the knowledge rate for that institution is 50%, then the college is claiming a very high level of overall success while having data for only half of its graduating class.”
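The arithmetic in that quote can be made concrete. As a hypothetical sketch (the function name and the rounding choice are just for illustration, not from the article), here is how a knowledge rate turns a headline success rate into a wide range of possible true outcomes:

```python
def outcome_bounds(class_size, reported_success_rate, knowledge_rate):
    """Bound the true success rate of a graduating class.

    The reported success rate applies only to graduates whose outcomes
    are known (the knowledge rate). Non-respondents could all be
    unsuccessful (lower bound) or all successful (upper bound).
    """
    known = class_size * knowledge_rate          # grads with known outcomes
    confirmed = known * reported_success_rate    # confirmed successes
    unknown = class_size - known                 # grads with no data
    lower = confirmed / class_size               # assume non-respondents failed
    upper = (confirmed + unknown) / class_size   # assume they all succeeded
    return round(lower, 3), round(upper, 3)

# The article's example: 1,000 graduates, 90% reported success,
# 50% knowledge rate -> true rate is anywhere from 45% to 95%.
print(outcome_bounds(1000, 0.90, 0.50))  # (0.45, 0.95)
```

Running the national class-of-2023 figures (85% success, 56% knowledge rate) through the same function gives a range of roughly 48% to 92%, which is the point the article is making: the headline number alone tells you very little.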

To add, key points from that article that point to the limitations of this metric without knowing how many/what percentage of graduates responded:

“For example, if you look at some of the colleges on The Wall Street Journal’s best value list, you can see some low knowledge rates. Baruch College earned the top spot for best value, and the school’s success rate for the class of 2022 was 96%. However, its knowledge rate was only 49%.

Similarly, Bowdoin College reported for its class of 2022 a 92% career success rate, with a response rate of 35%.”

I will be honest that I haven’t looked at these reports particularly closely, because in my industry they often consolidate job titles, roles, and market segments into one heading, which makes the potential takeaways almost meaningless or, even worse, misleading.

I have also noted that some of the sites use ambiguous descriptors for a variety of relevant info. For example, saying that students found their jobs “online” and including LinkedIn in that category fails to recognize that LinkedIn is the primary way to access alumni networks.

I don’t think such info is a total waste, but it needs to be contextualized and considered as just one very small piece of a much broader puzzle. I do think it highly ill-advised and a bit silly to make a lifelong decision based on a $10k difference in published outcomes, given its art-vs-science foundations, particularly from those who most loudly argue that it’s the kid that drives the result.

6 Likes

This is true, well said, and makes so much sense.

Many schools do put the response rate on their report. Bowdoin states that their average response rate is 30%; I personally wouldn’t put much credence into any data with 70% of it missing. Like anything, caveat emptor.

Alumni Survey (annual)

In 2012, Bowdoin began surveying alumni who had graduated from Bowdoin one, five and ten years earlier. The surveys collect information about post-Bowdoin careers and graduate school. In addition, alumni are asked about skills and abilities, balance and priorities in their lives, and their opinions of Bowdoin. Beginning in fall 2018, only graduates who are one year out of Bowdoin will be surveyed annually. Graduates who are five and ten years out of Bowdoin will be surveyed periodically. Recent survey response rates have been about 30%.

And 30% is exactly the response rate they received for the Class of 2024 report:

3 Likes

Response rate would be one more thing for students/families to do their due diligence about. I remember reading back in the day that UMich had a sub 40% response rate for their survey. As such, we didn’t give much weight to their report. And, if families can’t find the information by search engine, it’s a good question to ask the career office at the schools. If they don’t have the info, that would be a red flag to me about the value of the report.

IMO, these reports can provide another piece of information when looking at all the other factors that go into finding the right fit. Certainly not the be-all and end-all, but they also don’t need to be totally dismissed, especially at some schools that compile a robust amount of information.

2 Likes

Yes, exactly! Thank you for reiterating that. That’s the point of this entire thread! If people don’t look at that fact, they can be misled by the data. The article pointed out that Baruch had the number one spot on The Wall Street Journal’s best value list, but its data was clearly very limited. I guess it’s a good way to sorta “lie with statistics”. It would be great if every school did what Purdue did! But they don’t, unfortunately. :frowning:
“Baruch College earned the top spot for best value, and the school’s success rate for the class of 2022 was 96%. However, its knowledge rate was only 49%.”

4 Likes

Yes! This! Exactly!

And to add to Catcher’s fine post- just for context for the folks who aren’t close to the Financial Services Industry-

Analyst. A very common “early career” title. It might mean someone who crunches the numbers to determine if Goldman Sachs should modify its parental leave policy. It might mean the person who is part of the origination team at Lazard. It might mean a member of one of many investment teams at Citadel, or it might be the person who figures out if AQR should move some of its back office functions out of Greenwich CT and how much money that would save per year. Or the person who monitors the event budget at DE Shaw and figures out if they should change up their corporate preferred vendor list for next year.

Associate. Another common early career title. Same spread- it can apply to folks who are in support functions (HR, Operations, Real Estate/facilities, finance (not client-facing finance, but internal finance, i.e. tax, overhead line items, etc.), Communications/PR).

Some of these are the high-flying roles that kids think they want- and are able to persuade their parents that it’s worth taking out a second mortgage or spending down the retirement funds because “I’ll make it back in two years”. Some of these are well paying, professional jobs which are only a skootch better paid than their functional counterparts at a snack food company.

Most open-source type databases and reports do not make the distinction. Companies themselves (and industry associations) conduct their own salary benchmark studies, but these are proprietary.

Use the data with caution. In some fields the spread is narrow, so the potential to make poor decisions based on bad data is low. Yes, a special ed teacher on Long Island makes a lot more money than a special ed teacher in Tulsa. But since the data from a report can be verified by other sources, it isn’t likely that parents are making college decisions based on “Hey, look-- special ed is a really high paying profession”. In other fields, the spread is enormous: hedge funds, investment banks, as noted above.

LinkedIn- be very, very careful when trying to extrapolate useful data from the self-reported and unverified information on someone’s profile. My company verifies employment before extending an offer-- as in making sure that someone who claims to be an Associate at a leading investment bank is actually an “Associate”. Frequently, that person was a 10-week intern “working closely” with a team of Associates. Interns are not Associates! Or, someone was working at a temp agency and was assigned to work on a tech conversion of the global payment system (employees in London get paid in pounds, employees in NY get paid in dollars, employees in Paris get paid in euros; global companies have payroll systems which figure it all out and make sure the right taxes are withheld depending on the location). Yes, there was currency involved. But you were a temp, not an Associate. And you were working on a software conversion, NOT working on the “Global Currency Team”.

Just a clarification. Carry on.

5 Likes