https://www.usnews.com/education/best-high-schools/articles/how-us-news-calculated-the-rankings
Essentially, high schools were rewarded for the proportion of students who attempted even a single AP or IB class, and for the proportion who managed to score a 3 on an AP exam or a 4 on an IB exam. Schools were also rewarded when their “underserved” population came close to the average for non-underserved students. And so on. There is no look at PSAT/SAT/ACT scores, and no look at the percentage of students able to achieve a 5 on multiple AP exams.

Basically, the rankings reward schools for meeting easily gamed criteria designed to bring the lower rank of the student population closer to the middle of the pack. For example, AP Psych is considered a relatively easy AP class, so a school looking to game the criteria might run many sections of AP Psych and push borderline students into them, earning credit for a large proportion of its students at least attempting a single AP class, even if they never take the exam, or score below 3 if they do. Credit is also given for the percentage of students who earn a high school diploma, so tweaking the graduation criteria to nudge that F student into the D-minus range, so that he can be given a diploma, moves a school up in the ranks.
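The gaming incentive above can be made concrete with a toy calculation. This is a hypothetical illustration, not the actual U.S. News formula: it simply shows how a metric that rewards AP *participation* moves when borderline students are enrolled in an easy AP class, even if none of the new enrollees pass an exam.

```python
def participation_rate(ap_enrolled, total_students):
    """Fraction of the student body taking at least one AP class."""
    return ap_enrolled / total_students

# Hypothetical school of 1,000 students.
# Before gaming: 200 students take an AP class.
before = participation_rate(200, 1000)   # 0.20

# After gaming: 300 borderline students are pushed into AP Psych
# sections. Exam results are unchanged, but the metric jumps.
after = participation_rate(500, 1000)    # 0.50

print(f"before: {before:.0%}, after: {after:.0%}")
```

A ranking weighted on this rate credits the school for the jump from 20% to 50% regardless of whether any of the added students sit for, let alone pass, the exam.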
It is a laudable goal for struggling students to earn a high school diploma, and for low-achieving students to attempt at least one AP class. But generally, those are indicators not of high achievement but of “equitable” achievement, whatever that means. I most certainly would not consider these rankings any indication whatsoever of the level of academic excellence of any particular high school. I’d still look at SAT/ACT scores, the number of AP classes offered, the proportion of 4s and 5s earned on the AP exams, and college acceptances, if that data is available (keeping in mind the presence of “hooks” in the student population).