USNWR Best Med Schools 2022

I mean frankly, why should they only look at NIH funding? That’s arbitrary as well.

The schools that this helps will be happy about it. The schools it hurts will be unhappy.
I believe most clinical trials and other medical research flows through the NIH, I'm guessing their expansion to "any and all federal dollars" is letting a lot of previously excluded basic science labs get rolled into the med school numbers. A place like Hopkins that is primarily medicine with much less other graduate division presence would drop a lot from such a change.

It doesn't blow your mind that a school ranked in the 30s can catapult to #2, overtaking institutions that have been field leaders for more than a century, essentially overnight due to a methodology change?

It's no coincidence that the PDs of the country rated the big quaternary centers that get referred all the zebra pathology and have the most NIH projects at the top. Unless you think they're just regurgitating the rankings when they express their opinion too.

 
  • Like
Reactions: 1 user
A place like Hopkins that is primarily medicine with much less other graduate division presence would drop a lot from such a change.
wouldn't you expect a massive graduate research institution like UCLA to go up then?
 
Does anyone have access to the PD rankings?
It's screenshotted above. Essentially unchanged (PD ranks barely budge even after big rank swings). For research:

Last year:
#1 UCSF, Hopkins, HMS
#2 Penn, Stanford, WashU
#3 Columbia, Michigan
#4 UCLA, Pitt

This year:
#1 UCSF, Hopkins, HMS
#2 Penn, Stanford, WashU, Michigan
#3 Duke, Columbia
#4 Pitt, NYU, UCLA, Mayo
 
Last edited:
  • Like
Reactions: 5 users
wouldn't you expect a massive graduate research institution like UCLA to go up then?
Maybe they will next year. They probably have to learn the new rules of the game; maybe they didn't declare as much as they could have under their SOM on their 2019 paperwork.
 
  • Like
Reactions: 1 users
one thing I just noticed, UCLA's stats don't match MSAR. USNWR has UCLA at 3.69 average GPA and 512 average MCAT, while MSAR has it (for matriculants) at 3.81 and 515. I checked a random sample of 3 other schools (UChicago, WashU, UCSF) and they all matched exactly the MSAR matriculation stats, except UCSF GPA which was 3.86 in USNWR and 3.85 in MSAR. this may have something to do with Charles Drew being a separate but related institution, but it looks like USNWR treats it as a separate university that is unranked.

i wonder if there was an error in reporting admissions data, and if that might bump UCLA back in the t20?
 
I have been doing research at a medical school for the past few years and will be matriculating this summer. My PI is an MD and we've had several MD students rotate through our lab in the past years.

It is a pretty hot take but imo, if you want to publish, you'll get to publish, no matter what MD school you go to. There are always more labs looking for free labor than there are medical students. And again, research funding doesn't mean higher productivity. A better barometer would be how many pubs/student each school has. Weighting research funding at 40% is just lazy and imo deceptive.

I will say that higher funding probably attracts more well-known researchers, which could help you. But PD rankings should cover that aspect already.
I mean funding at least has some correlation with the quality of the research going on. There are exceptions but normally junk research isn’t getting 7 and 8 figure grants year after year. Research pubs/student is even more useless than total research funding…have you read some of the crap that gets published? 😬
 
one thing I just noticed, UCLA's stats don't match MSAR. USNWR has UCLA at 3.69 average GPA and 512 average MCAT, while MSAR has it (for matriculants) at 3.81 and 515. I checked a random sample of 3 other schools (UChicago, WashU, UCSF) and they all matched exactly the MSAR matriculation stats, except UCSF GPA which was 3.86 in USNWR and 3.85 in MSAR. this may have something to do with Charles Drew being a separate but related institution, but it looks like USNWR treats it as a separate university that is unranked.

i wonder if there was an error in reporting admissions data, and if that might bump UCLA back in the t20?
They're often based on different years of data. For example they're using 2019 funding paperwork in their 2022 rankings, and I think MCAT/GPA data is often a year behind the MSAR
 
  • Like
Reactions: 1 user
one thing I just noticed, UCLA's stats don't match MSAR. USNWR has UCLA at 3.69 average GPA and 512 average MCAT, while MSAR has it (for matriculants) at 3.81 and 515. I checked a random sample of 3 other schools (UChicago, WashU, UCSF) and they all matched exactly the MSAR matriculation stats, except UCSF GPA which was 3.86 in USNWR and 3.85 in MSAR. this may have something to do with Charles Drew being a separate but related institution, but it looks like USNWR treats it as a separate university that is unranked.

i wonder if there was an error in reporting admissions data, and if that might bump UCLA back in the t20?
USNWR is average, MSAR is median. Median is a more useful number but very rarely reported outside of MSAR. Most schools use averages for MCAT and GPA on their websites :/
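The gap between the two figures is exactly what you'd expect from average vs median on a left-skewed distribution. A minimal sketch with made-up GPAs (not real data for any school):

```python
from statistics import mean, median

# Hypothetical matriculant GPAs: mostly high, with a couple of low outliers.
gpas = [3.9, 3.9, 3.85, 3.8, 3.8, 3.75, 3.3, 3.1]

avg = mean(gpas)    # the kind of figure USNWR reports; pulled down by outliers
med = median(gpas)  # the kind of figure MSAR reports; resistant to outliers

print(f"average: {avg:.2f}")
print(f"median:  {med:.2f}")
```

With a few low outliers in an otherwise high-GPA class, the average lands visibly below the median, which would explain a school looking "weaker" in USNWR than in MSAR.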
 
  • Like
Reactions: 4 users
It's screenshotted above. Essentially unchanged (PD ranks barely budge even after big rank swings). For research:

Last year:
#1 UCSF, Hopkins, HMS
#2 Penn, Stanford, WashU
#3 Columbia, Michigan
#4 UCLA, Pitt

This year:
#1 UCSF, Hopkins, HMS
#2 Penn, Stanford, WashU, Michigan
#3 Duke, Columbia
#4 Pitt, NYU, UCLA, Mayo
I mean outside of the top10 PD rankings lol
 
  • Like
Reactions: 1 users
They're often based on different years of data. For example they're using 2019 funding paperwork in their 2022 rankings, and I think MCAT/GPA data is often a year behind the MSAR
hmmm weird how so many are spot on, but i found another that was off (UNC), so that ruins my theory. i guess it goes to show things don't change much year to year at the top.
 
  • Like
Reactions: 1 users
Let's create our own rankings based on the raw data behind the USNews paywall.
 
  • Hmm
Reactions: 1 user
I believe most clinical trials and other medical research flows through the NIH, I'm guessing their expansion to "any and all federal dollars" is letting a lot of previously excluded basic science labs get rolled into the med school numbers. A place like Hopkins that is primarily medicine with much less other graduate division presence would drop a lot from such a change.

It doesn't blow your mind that a school ranked in the 30s can catapult to #2, overtaking institutions that have been field leaders for more than a century, essentially overnight due to a methodology change?

It's not coincidence that the PDs of the country rated the big quaternary centers that get referred all the zebra pathology and have the most NIH projects at the top. Unless you think they're just regurgitating the rankings when they express their opinion too
It really doesn’t blow my mind, because the places that are jumping up have also been contributing to the field for decades. Our perceptions just haven’t granted them that benefit because it’s been so quick. There’s really no actual reason why Mayo and NYU can’t be “top 10” - they have amazing hospitals, top tier residencies, leaders in numerous specialties, impressive match lists, very high PD ratings and a strong research engine. Just like pretty much all the other schools in the top 30 or so.

I honestly like the shakeup: it challenges our long-held perceptions of how we “rank” schools, it showcases the volatility and arbitrary nature of the USNWR rankings, and it hurts the egos of those who have historically been on top and have now fallen, all at the same time.
 
  • Like
Reactions: 8 users
It really doesn’t because places that are jumping up have also been contributing to the field for decades. Our perceptions just haven’t granted them that benefit because it’s been so quick. There’s really no actual reason why Mayo and NYU can’t be “top 10” - they have amazing hospitals, they have top tier residencies, they have leaders in the fields in numerous specialties, impressive match lists and a strong research engine. Just like pretty much all the other schools on the top 30 or so.

I honestly like the shakeup. Because it challenges our old held perceptions of how we “rank” schools, it showcases the volatility and arbitrary nature of the USNWR rankings, and it hurts the egos of those who have historically been on the top and now have fallen, all at the same time.
So next year for the college rankings, if they put Northwestern, Vandy, and WashU as their top handful above Harvard, Stanford, and Yale...that's a reasonable shakeup too? Doesn't quite pass the sniff test to me, and I went to one of the first group.
 
  • Like
Reactions: 3 users
So next year for the college rankings, if they put Northwestern Vandy WashU as their top handful above Harvard Stanford and Yale...that's a reasonable shake up too? Doesn't quite pass the sniff test to me, and I went to one of the first group
I haven’t looked at college rankings in a long time and am not aware how they are even done. If they are as arbitrary as the medical school ones, I’ll hear out the case.

But it’s not just a shakeup for a shakeup’s sake when it comes to the medical school rankings. In my previous post I made the case for the newcomers being deserving. Again, no one can objectively articulate why Mayo or NYU can’t possibly be top 10 schools. Why Yale can’t be 10-20. Why UCLA couldn’t possibly be >20. We think this is crazy because this same magazine told us something different from 2000-2010. But now we have these schools also high in the PD rankings. You can always make these cases because the differences at the top tier are frankly minimal - in my view you could realistically shuffle the top 20 and it makes no practical difference. The only one I don’t see shifting is Harvard, due to the crazy amount of research funds they have.

To help convince you that I'm not 100% Joker in TDK, I do think Harvard and Hopkins are historically cemented at the top for numerous reasons. But beyond that? It's all fair game, and may the odds be ever in their favor.
 
Last edited:
  • Like
Reactions: 11 users
You know, the only people who care about these types of things are starry-eyed, Prestige intoxicated pre-meds, and medical school Deans.
And certain parents who want to brag that their child goes to a T5 medical school...
 
  • Like
  • Haha
Reactions: 15 users
You know, the only people who care about these types of things are starry-eyed, Prestige intoxicated pre-meds, and medical school Deans.
Well, yes, but those are the two sides of the transactions that are going on here. The buyers (students) and the sellers (Deans).
 
  • Like
Reactions: 1 user
Well, yes, but those are the two sides of the transactions that are going on here. The buyers (students) and the sellers (Deans).
in theory you're buying for resale value, so you also have to think about who you're going to sell to later (Program Directors)
 
  • Like
Reactions: 1 users
You know, the only people who care about these types of things are starry-eyed, Prestige intoxicated pre-meds, and medical school Deans.
hence the disclaimer
 
  • Like
Reactions: 1 users
I haven’t looked at college rankings in a long time and am not aware how they are even done. If they are as arbitrary as the medical school ones, I’ll hear out the case.

But it’s not just a shakeup for a shakeups sake when it comes to the medical school rankings. In my previous post I make the case for the newcomers being deserving. Again, no one can objectively articulate why Mayo or NYU can’t possibly be top 10 schools. Why Yale can’t be 10-20. Why UCLA couldn’t possibly be >20. We think this is crazy because this same magazine has told us something different from 2000-2010. But now we have these schools also high in the PD rankings. You can always make these cases because the differences at the top tier are frankly minimal - in my view you could realistically shuffle the top 20 and it makes no practical difference. The only one I don’t see shifting is Harvard, due to the crazy amount of research funds they have.

To help convince you that I'm not 100% Joker in TDK, I do think Harvard and Hopkins are historically cemented at the top for numerous reasons. But beyond that? It's all fair game, and may the odds be ever in their favor.
An argument could be made based on the affiliated hospitals and their training programs; that's really what the clinical years will reflect. When you survey senior med school faculty about the strongest residency programs in a bunch of specialties, it's:

Anesthesia: Hopkins, Harvard, UCSF (NYU #9)
Internal med: UCSF, Harvard, Hopkins (NYU #10)
OBGYN: UCSF, Harvard, Penn (NYU rank not published)
Pediatrics: Penn, Harvard, Cincinnati (NYU rank not published)
Psychiatry: Yale, Harvard, Hopkins, Penn (NYU unranked)
Radiology: Hopkins, Harvard, UCSF (NYU rank not published)
Surgery: Hopkins, Duke, Harvard (NYU rank not published)

You can see how it would be weird for places like Hopkins and Penn to be 7-9th while NYU is 2nd.
 
  • Like
Reactions: 5 users
An argument could be made based on the affiliated hospitals and their training programs, that's really what the clinical years will reflect. When you survey senior med school faculty about the strongest residency programs in a bunch of specialties, its:

Anesthesia: Hopkins, Harvard, UCSF (NYU #9)
Internal med: UCSF, Harvard, Hopkins (NYU #10)
OBGYN: UCSF, Harvard, Penn (NYU rank not published)
Pediatrics: Penn, Harvard, Cincinnati (NYU rank not published)
Psychiatry: Yale, Harvard, Hopkins, Penn (NYU unranked)
Radiology: Hopkins, Harvard, UCSF (NYU rank not published)
Surgery: Hopkins, Duke, Harvard (NYU rank not published)

You can see how it would be weird for places like Hopkins and Penn to be 7-9th while NYU is 2nd.

I don't really know what this list is based off of or why it's only a few specialties.


Either way, you and I both know that looking at the USNWR rankings granularly, rather than in general tiers, is a fool's errand.
 
I don't really know what this list is based off of or why it's only a few specialties.


Either way, you and I both know to look at the USNWR rankings granularly and not based on general tiers is a fools errand.
It's their own specialty rankings, based on a survey of senior faculty asking them to name the best places to train. Basically the same methodology as Doximity except they ask faculty instead of site users. Those were the specialties they surveyed (plus FM which is a completely different group of names more aligned with primary care ranks).

Unfortunately I bet these rankings do impact school choices for a lot of people. I doubt schools would care as much as they do otherwise.
 
Let's create our own rankings based on the raw data behind the USNews paywall.


As I said last year...
  1. Like last year, I have assigned equal weight to research and primary care rankings and simply added them together to make the total score.
  2. Every school on USNWR is covered, including those that have "Ranking Not Published (RNP)" designations.
  3. Though the residency director ratings may be better than the aggregate USNWR rankings, there are reasons to be skeptical. How are program directors polled on these rankings? It is hard to imagine many PDs sitting down and forming a rank list of 185 institutions with any kind of significant reproducibility or resolution. More transparency in the methodology of these ratings would be useful.
  4. Please let me know if there are typos or other errors and I will fix them ASAP.
  5. I added step 1 and step 2 scores when they were available (bold means they're updated)
also for the record, I stand by what I said on SDN - While speculation about this may yield interesting results, I'd like to remind people that USNWR rankings have faced very legitimate criticisms and this study (Gollehon NS, Stansfield RB, Gruppen LD, et al. Assessing Residents' Competency at Baseline: How Much Does the Medical School Matter?. J Grad Med Educ. 2017;9(5):616–621. doi:10.4300/JGME-D-17-00024.1) showed that "Our results suggest that residents' medical school of origin is weakly correlated with clinical competency as measured by a standardized OSCE."
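For what it's worth, the combination rule in point 1 is easy to reproduce. A minimal sketch of equal-weight rank addition, using hypothetical school names and ranks (not the actual USNWR data):

```python
# Equal-weight combination: total score = research rank + primary care rank,
# as described in point 1 above. Lower total = better combined standing.
# School names and ranks here are hypothetical.
ranks = {
    "School A": {"research": 1, "primary_care": 30},
    "School B": {"research": 12, "primary_care": 5},
    "School C": {"research": 8, "primary_care": 20},
}

totals = {school: r["research"] + r["primary_care"] for school, r in ranks.items()}

for school, total in sorted(totals.items(), key=lambda kv: kv[1]):
    print(f"{school}: {total}")
```

Note the design consequence people raise later in the thread: a school that is mediocre in research but elite in primary care can outscore a research powerhouse, since both ranks count equally.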
 
  • Like
Reactions: 10 users
Wow! I hope my school ranked high for pathology. They didn’t list it. Here I am, 69 yo and retired, and just HOW the heck am I supposed to know what kind of education I got?
I hope it ranks well.
 
  • Haha
  • Like
Reactions: 16 users
Wow! I hope my school ranked high for pathology. They didn’t list it. Here I am, 69 yo and retired and just HOW the heck am I supposed to know what kind of education I got.
I hope it ranks well.
This is definitely my favorite comment lol
 
  • Like
Reactions: 1 user
yeah sorry, I had to manually enter all of the remaining data... which took 2 hours
no need to apologize at all!! i should be the one thanking you for putting this together, i was hesitant to ask initially because i know it's a lot of work :/ so thank you so much!!
 
  • Like
Reactions: 1 user


As I said last year...
  1. Like last year, I have assigned equal weight to research and primary care rankings and simply added them together to make the total score.
  2. Every school on USNWR is covered, including those that have "Ranking Not Published (RNP)" designations.
  3. Though the residency director ratings may be better than the aggregate USNWR rankings, there are reasons to be skeptical. How are program directors polled on these rankings? It is hard to imagine many PDs sitting down and forming a rank list of 185 institutions with any kind of significant reproducibility or resolution. More transparency in the methodology of these ratings would be useful.
  4. Please let me know if there are typos or other errors and I will fix them ASAP.
  5. I added step 1 and step 2 scores when they were available (bold means they're updated)
also for the record, I stand by what I said on SDN - While speculation about this may yield interesting results, I'd like to remind people that USNWR rankings have faced very legitimate criticisms and this study (Gollehon NS, Stansfield RB, Gruppen LD, et al. Assessing Residents' Competency at Baseline: How Much Does the Medical School Matter?. J Grad Med Educ. 2017;9(5):616–621. doi:10.4300/JGME-D-17-00024.1) showed that "Our results suggest that residents' medical school of origin is weakly correlated with clinical competency as measured by a standardized OSCE."

This is much better, back to T5 bragging :)
 


As I said last year...
  1. Like last year, I have assigned equal weight to research and primary care rankings and simply added them together to make the total score.
  2. Every school on USNWR is covered, including those that have "Ranking Not Published (RNP)" designations.
  3. Though the residency director ratings may be better than the aggregate USNWR rankings, there are reasons to be skeptical. How are program directors polled on these rankings? It is hard to imagine many PDs sitting down and forming a rank list of 185 institutions with any kind of significant reproducibility or resolution. More transparency in the methodology of these ratings would be useful.
  4. Please let me know if there are typos or other errors and I will fix them ASAP.
  5. I added step 1 and step 2 scores when they were available (bold means they're updated)
also for the record, I stand by what I said on SDN - While speculation about this may yield interesting results, I'd like to remind people that USNWR rankings have faced very legitimate criticisms and this study (Gollehon NS, Stansfield RB, Gruppen LD, et al. Assessing Residents' Competency at Baseline: How Much Does the Medical School Matter?. J Grad Med Educ. 2017;9(5):616–621. doi:10.4300/JGME-D-17-00024.1) showed that "Our results suggest that residents' medical school of origin is weakly correlated with clinical competency as measured by a standardized OSCE."

Question: why do you weight the research and primary care PD scores equally? Certainly for competitive specialties (which, let's be honest, is the only reason you would care about "rankings" anyway), PDs aren't going to care about the primary care score, because competitive programs simply aren't primary care specialties.
 
  • Like
Reactions: 1 user
Question, why do you equally rate research and primary PD scores? Certainly for competitive specialties (which, lets be honest, is the only reason you would care about "rankings" anyways), PDs aren't going to care about the primary care score, because competitive programs just simply aren't primary care specialties.
I'm curious about this too, not sure the reason for combining them when the ranking being shown in comparison is the research ranking. Adding them together falsely inflates schools with high PD rankings for primary care.
 
  • Like
Reactions: 2 users
What exactly is the primary care ranking? It seems like public schools are doing much better; is it because of the patient population or their focus on teaching vs research?
 
what exactly is primary care ranking? Seems like public schools seems to doing much better, is it because of the patient population or their focus on teaching vs research?
You can see the methodology here. Basically it weights admissions statistics and research less heavily, and puts more weight on how many students from each school end up in primary care and on the ratings from primary care PDs.
 
  • Like
Reactions: 2 users
I wonder how they know whether an IM match is primary care or gonna specialize. Same for peds
 
  • Like
Reactions: 3 users
I wonder how they know whether an IM match is primary care or gonna specialize. Same for peds
Primary Care Production
Primary care production is used in the primary care ranking model only. Its two indicators were weighted in total at 40%.
Medical school graduates practicing in primary care specialties (0.30): This new indicator measures the proportion of a medical school's 2012-2014 graduates who are practicing in a primary care specialty as of 2020. It's a fuller measure of a school's imprint in primary care than exclusively assessing the proportions of graduates in primary care residencies. U.S. News worked with the Robert Graham Center – a division of the American Academy of Family Physicians – as the data provider.
Medical school graduates into primary care residencies (0.10, previously 0.30): The percentages of a school's M.D. or D.O. graduates entering primary care residencies in the fields of family practice, pediatrics and internal medicine were averaged over 2018, 2019 and 2020.
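As a back-of-the-envelope illustration of how those two indicators feed the score, here is a simple weighted sum using the weights from the quoted methodology. The percentages are made up, and this ignores the normalization USNWR applies across schools before weighting:

```python
# Weights for the two primary care production indicators, per the
# quoted methodology. The example percentages below are hypothetical.
weights = {
    "grads_practicing_primary_care": 0.30,       # practicing in PC as of 2020
    "grads_into_primary_care_residencies": 0.10, # entering PC residencies
}
school = {
    "grads_practicing_primary_care": 0.45,       # 45% of 2012-2014 grads
    "grads_into_primary_care_residencies": 0.38, # 38% of recent grads
}

# Contribution of primary care production (40% of the PC model in total)
contribution = sum(weights[k] * school[k] for k in weights)
print(f"{contribution:.3f}")
```

The practical point is the reweighting: actually practicing primary care now counts three times as much as merely matching into an IM/peds/FM residency, which addresses the "IM -> cardiology" loophole discussed above.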
 
I wonder how they know whether an IM match is primary care or gonna specialize. Same for peds
it looks like there was an additional metric that evaluates how many graduates end up working in primary care, rather than just going to "primary care residencies" that could be academic IM -> cardiology and the like
 
Next years rankings:
1. NYU
2. Harvard
2. Stanford
2. UCSF
2. Columbia
2. Johns Hopkins
2. Mayo
2. WashU
2. Duke
2. Yale
11. Double-digit riff raff
I know it's a joke, but it's a great opp to post this classic (even though the methodology has changed several times since it was written):

 
  • Like
Reactions: 4 users
hmmm weird how so many are spot on, but i found another that was off (UNC), so that ruins my theory. i guess it goes to show things don't change much year to year at the top.
It could also be that some schools have updated MSAR data while others don't. I am pretty sure MSAR is continuously updated whereas USNWR is static once published.
 
An argument could be made based on the affiliated hospitals and their training programs, that's really what the clinical years will reflect. When you survey senior med school faculty about the strongest residency programs in a bunch of specialties, its:

Anesthesia: Hopkins, Harvard, UCSF (NYU #9)
Internal med: UCSF, Harvard, Hopkins (NYU #10)
OBGYN: UCSF, Harvard, Penn (NYU rank not published)
Pediatrics: Penn, Harvard, Cincinnati (NYU rank not published)
Psychiatry: Yale, Harvard, Hopkins, Penn (NYU unranked)
Radiology: Hopkins, Harvard, UCSF (NYU rank not published)
Surgery: Hopkins, Duke, Harvard (NYU rank not published)

You can see how it would be weird for places like Hopkins and Penn to be 7-9th while NYU is 2nd.
I take it you go to Harvard, Hopkins, or UCSF.
Checked, looks like you go (or went) to Hopkins
Sorry, but I gotta call you out.
In the following competitive fields, Doximity Reputation rankings:

DERM NYU 3, JHU 17
INT RADIOLOGY NYU 4, JHU 23
ORTHO NYU 7, JHU 19
PLASTIC SURGERY NYU 2, JHU 5
THORACIC SURGERY NYU 11, JHU NR


What I find weird is why someone would pay 60 thousand a year in tuition to go to Hopkins (or any other med school for that matter), when NYU is free
 
  • Like
Reactions: 7 users
I take it you go to Harvard, Hopkins, or UCSF.
Checked, looks like you go (or went) to Hopkins
Sorry. but I gotta call you out
In the following competitive fields, Doximity Reputation rankings:

DERM NYU 3, JHU 17
INT RADIOLOGY NYU 4, JHU 23
ORTHO NYU 7, JHU 19
PLASTIC SURGERY NYU 2, JHU 5
THORACIC SURGERY NYU 11, JHU NR


What I find weird is why someone would pay 60 thousand a year in tuition to go to Hopkins (or any other med school for that matter), when NYU is free
Int rads and thoracic are only sortable by size

It's cool that NYU is stronger in ortho, derm and plastics but you can't tell me with a straight face that overall the residencies at the two are similar, historically or now
 
  • Like
Reactions: 1 users


As I said last year...
  1. Like last year, I have assigned equal weight to research and primary care rankings and simply added them together to make the total score.
  2. Every school on USNWR is covered, including those that have "Ranking Not Published (RNP)" designations.
  3. Though the residency director ratings may be better than the aggregate USNWR rankings, there are reasons to be skeptical. How are program directors polled on these rankings? It is hard to imagine many PDs sitting down and forming a rank list of 185 institutions with any kind of significant reproducibility or resolution. More transparency in the methodology of these ratings would be useful.
  4. Please let me know if there are typos or other errors and I will fix them ASAP.
  5. I added step 1 and step 2 scores when they were available (bold means they're updated)
also for the record, I stand by what I said on SDN - While speculation about this may yield interesting results, I'd like to remind people that USNWR rankings have faced very legitimate criticisms and this study (Gollehon NS, Stansfield RB, Gruppen LD, et al. Assessing Residents' Competency at Baseline: How Much Does the Medical School Matter?. J Grad Med Educ. 2017;9(5):616–621. doi:10.4300/JGME-D-17-00024.1) showed that "Our results suggest that residents' medical school of origin is weakly correlated with clinical competency as measured by a standardized OSCE."

Does anyone know why Mt. Sinai is ranked significantly lower by PDs (#38) than by USNWR (#17)?
 
Thanks for doing this. Just a quick question for you or anyone who knows: are average Step scores available for DO schools? I know they take COMLEX, but I heard that up to 60% of them take Step. Do the DO schools maintain this data but not release it because they are COMLEX-centric?
Most of them do not publish them. RVU mandates that their students take Step, and their average is usually around 223 or something like that. My school only has about 30/110 people take it (lots of home programs people want to go to, so they don't need it) and our average my year was 225.
 
  • Like
Reactions: 1 users
NYU's meteoric ascent is just insane.

It's not at all surprising. They have full scholarships - they can get whatever top tier students they want with damn-near-100% yield. A large part of the ranking is caliber of students (which has to be sky-high) and acceptance rate (which has to be rock bottom - due to the high yield and how many people want a full scholarship to a NY med school).
 
  • Like
Reactions: 5 users
It's not at all surprising. They have full scholarships - they can get whatever top tier students they want with damn-near-100% yield. A large part of the ranking is caliber of students (which has to be sky-high) and acceptance rate (which has to be rock bottom - due to the high yield and how many people want a full scholarship to a NY med school).
Your points are true, but a bit overstated. Their yield is sky high, but it's around 65% (double what it was pre-2019), which is nowhere close to 100%. You need to take into account that PLENTY of top students either have significant financial need (which translates to big need-based grants at peer schools), are eligible for significant merit scholarships elsewhere, or come from families where $300K is meaningless. So not everyone is motivated by free tuition, and NYU actually has a lower yield than Harvard, although it is still one of the highest in the country. Just not "damn near 100%."
 
  • Like
Reactions: 4 users