2024 USNWR Medical School Rankings

Status
Not open for further replies.

ItAintEasyBeingCheesy
big cheese · 2+ Year Member
Joined: Mar 3, 2020 · Messages: 114 · Reaction score: 197
A sneak preview of this year's rankings is out. It looks like US News intends to include withdrawn schools going forward, using past and publicly available information. What are people's thoughts?


 
I've stopped paying attention to these rankings. They mean less and less as each year passes, IMO. Schools dropping out is not a good indicator for US News.
 
Some quick initial thoughts:

Seems like greater emphasis was placed on faculty:student ratio, which I personally think is a nice measure of education quality. An NIH grant awards metric for research quality was also added. Personally, I'm a big fan of this addition given how shady industry funding can sometimes be. Overall, I think this year's rankings are a step in the right direction. Of course, not everyone will be happy, but the survey criteria seem to be shifting the right way.
 
I've stopped paying attention to these rankings. They mean less and less as each year passes, IMO. Schools dropping out is not a good indicator for US News.
Regardless of whether or not they mean anything, people will always look toward a list to rank schools. It's human nature to want to rank and assign labels of prestige onto things. So whether or not these rankings do a good (or even subpar) job of evaluating schools, I still think they carry a lot of power, especially when it comes to influencing parents and young pre-meds who don't know as much as we do.
 
  • Like
Reactions: 8 users
I’ve always felt that rankings were meaningful for large groupings such as top 10 or top 20, middle, low, etc. I think there’s some merit to those broader comparisons.

When you get more granular, though, it quickly becomes ridiculous - nothing has materially changed at any school on the list since the last rankings, yet their positions shift a bit. And even when they don’t alter the methodology much, things still shift a bit each year. But overall I think it’s helpful to see how schools compare broadly.
 
  • Like
Reactions: 4 users
Regardless of whether or not they mean anything, people will always look toward a list to rank schools. It's human nature to want to rank and assign labels of prestige onto things. So whether or not these rankings do a good (or even subpar) job of evaluating schools, I still think they carry a lot of power, especially when it comes to influencing parents and young pre-meds who don't know as much as we do.
Agree. Perception can be a real thing with real impact / consequences.
 
  • Like
Reactions: 1 user
Agree. Perception can be a real thing with real impact / consequences.
As such, we should try to critique and optimize the existing ranking mechanisms we have. While they might never be perfect or even “good,” they will always be there and hold some degree of influence. We might as well try to make them as accurate as possible and in line with the values we want to see upheld within medicine. I personally think this year's methodology changes were a nice step toward the things I want to see emphasized in medicine (NIH-funded research, a higher faculty:student ratio, etc.)
 
  • Like
Reactions: 1 users
Putting more weight on research and faculty ratio instead of surveys is the right move. The fake prestige survey and the location-affinity-based high scores were superficial. It would be interesting to see the entire T30 list, which would provide better guidance for everyone, since the data is difficult to compile and compare even when it is publicly available.

Individuals can rank NYU higher on their own for the free tuition, but from a guidance perspective, NYU's current ranking relative to other institutions seems about right.
 
Last edited:
  • Like
  • Hmm
Reactions: 1 users
How dare they not put HMS at #1 or #2, when it had never been ranked below #1 :cool:
 
Last edited:
I caught part of an NPR story recently where they interviewed, among others, the CEO of USNWR and he sounded positively panicked when attempting to answer some tough questions about the value of the rankings.


(relevant interview starts at about 10:30)
 
  • Haha
  • Like
Reactions: 5 users
Adding NIH funding and faculty:student ratio is going in the right direction. On the other hand, diminishing the value of MCAT and GPA is going in the wrong direction.
 
  • Like
Reactions: 2 users
Adding NIH funding and faculty:student ratio is going in the right direction. On the other hand, diminishing the value of MCAT and GPA is going in the wrong direction.
They are forced to because some of the top schools refuse to release MCAT info, despite the data showing a strong correlation between MCAT scores and USMLE results.
 
  • Haha
  • Hmm
Reactions: 1 users
Members don't see this ad :)
They are forced to because some of the top schools refuse to release MCAT info, despite the data showing a strong correlation between MCAT scores and USMLE results.
Sorry if I sound dumb. Can’t US News get the MCAT/GPA from the MSAR? It is public information, right?
 
  • Like
Reactions: 1 user
Seems like greater emphasis was placed on faculty:student ratio, which I personally think is a nice measure of education quality.
I can see this being the case at a liberal arts college where a priority is to have small classes taught by faculty (instead of TAs). As a proxy for medical student education, I'm less convinced. You can have full-time, part-time, and adjunct faculty out the wazoo; that doesn't mean they are creating a better learning environment than a smaller group of education-oriented faculty. Especially since nobody goes to class anymore.

An NIH grant awards metric for research quality was also added.
I'm perplexed by this "addition." Not long ago NIH awards were counted in the rankings, then they shifted to all federal grants. Now it's back to only NIH awards? And they are heralding this as some huge improvement? Or am I missing something?
 
  • Like
  • Love
Reactions: 5 users
They are forced to because some of the top schools refuse to release MCAT info, despite the data showing a strong correlation between MCAT scores and USMLE results.
Step is pass/fail.
 
  • Like
Reactions: 1 user
These changes are minor and barely address the substantial methodological issues that have been pointed out to them numerous times.

They say they focus on student outcomes - but from their description, it's in words only. Faculty/student ratios are commonly misrepresented and bloated by the abundance of administrative personnel and other faculty (especially at larger institutions) who never interface with medical students; it is also not an outcomes measure. NIH grants are provided to PIs, and to students in the case of some grants like F30/31s, which are (by requirement) PhD and MD/PhD students (the latter of which represent ~2% of the MD population). Again, this is not an adequate outcomes measure.

From a previous post (note the link to the 2015 paper's methodology that does take into account outcomes by analyzing alumni):

1. US News polls medical school deans and department chairs/program directors for a "reputation" score with horrifically abysmal response rates. Not only this, but they pool from only 3 specialties (likely, again, because they get horrifically abysmal response rates; they likely prefer it - it is much less data to sort through). This leaves almost no confidence that the opinions of the sample group actually depict the population you are attempting to represent - and for a measure that is inherently subjective to begin with (McGhahie et al., 2001; 2019).

2. Federal research dollars are the largest component of the ranking system (the "research" category). They flip-flop between counting NIH-only funding and federal/affiliate/private funding from year to year, which varies the list despite nothing changing at the programs themselves to increase or reduce their merit. (A toy illustration of this is below.)

3. There is little data to suggest that the strength of the research enterprise correlates with clinical outcomes. In fact, US News ranking has no correlation with clinical competency measured in residents post-graduation (Tsugawa et al., 2018).

4. Regarding research competency, there is no data provided on student outcomes for a school as a factor in their methodology. These data are not only readily available, but a group of physicians decided to show that this can be done in their off-time by publishing a paper compiling 60+ years of graduate outcomes from over 120 medical schools to assemble an alternate, research-specific ranking methodology (Goldstein et al., 2015).

The US News system fails, from its very design, to effectively measure "the best medical schools" clinically, or in research. Yes, there are better alternatives. No, they do not want to adopt them.
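To make point 2 concrete, here is a toy sketch (made-up schools, numbers, and weights - not USNWR's actual formula) showing how merely swapping the funding metric reshuffles the order with zero change at any school:

```python
# Toy illustration only: made-up schools, numbers, and weights.
# Swapping the funding metric (NIH-only vs. all federal) flips the ordering
# even though nothing about any school has changed.

schools = {
    # name: {"nih": $M NIH awards, "federal": $M all federal, "ratio": faculty per student}
    "School A": {"nih": 680, "federal": 700, "ratio": 2.4},
    "School B": {"nih": 520, "federal": 950, "ratio": 2.6},  # lots of non-NIH federal money
    "School C": {"nih": 610, "federal": 640, "ratio": 3.5},
}

def composite(funding_key, w_funding=0.7, w_ratio=0.3):
    """Normalize each metric to its max, then take a weighted sum."""
    max_fund = max(s[funding_key] for s in schools.values())
    max_ratio = max(s["ratio"] for s in schools.values())
    return {
        name: w_funding * s[funding_key] / max_fund + w_ratio * s["ratio"] / max_ratio
        for name, s in schools.items()
    }

for key in ("nih", "federal"):
    scores = composite(key)
    print(key, "->", sorted(scores, key=scores.get, reverse=True))
# nih     -> ['School C', 'School A', 'School B']
# federal -> ['School B', 'School C', 'School A']
```

Any composite index is this sensitive to its weighting and metric choices, which is why the year-to-year movement says more about the methodology than about the schools.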

Sources:
1. Deans: Dump that USNWR 'best medical school' survey
2. Association between physician US News & World Report medical school ranking and patient outcomes and costs of care: observational study
3. America's Best Medical Schools: A Critique of the: U.S.... : Academic Medicine
4. America’s Best Medical Schools: A Renewed Critique of the... : Academic Medicine
5. What Makes a Top Research Medical School? A Call for a New... : Academic Medicine
 
Last edited:
  • Like
  • Love
Reactions: 12 users
The preview of the USNWR ranking more or less generates the same list of well-resourced and selective schools in the Top 15 that we have seen year after year. That is because the ranking system does not want to factor in an important index: the economic mobility index.

This would be based on the proportion of low-SES students enrolled and the enhanced economic outcomes that the school provides them. It would be an incentive for med schools to rethink their poor efforts to recruit meritorious but low-SES students (regardless of race, sex, etc.) and to provide appropriate financial and merit-based aid to them.

Needless to say, this would result in a more socioeconomically diverse class of students, more graduates choosing to serve underserved areas and HPSAs, and more going into “less-profitable” specialties such as primary care and pediatrics. Med schools can and need to deliver on the promise of economic mobility for all students while fulfilling their mission of caring for underserved populations and improving health for all. An economic mobility index would be a good place to start.
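For what it's worth, here is a rough back-of-the-envelope sketch of what such an index could look like (every input and number below is hypothetical, not any published formula):

```python
# Hypothetical economic mobility index (EMI) - purely illustrative.
# A real index would need audited enrollment, income, and debt data.

def economic_mobility_index(pct_low_ses, median_grad_income,
                            median_family_income, median_debt):
    """Reward schools that enroll low-SES students AND leave them better off.

    pct_low_ses: share of the class from low-SES backgrounds (0-1).
    The income gain relative to family income is discounted by debt burden.
    """
    mobility = (median_grad_income - median_family_income) / median_family_income
    debt_penalty = median_debt / median_grad_income
    return pct_low_ses * max(mobility - debt_penalty, 0)

# Two hypothetical schools: a selective, well-resourced one enrolling few
# low-SES students, and one enrolling many more but with heavier debt.
print(economic_mobility_index(0.08, 250_000, 180_000, 60_000))   # ~0.012
print(economic_mobility_index(0.30, 230_000, 60_000, 250_000))   # ~0.52
```

Even a crude measure along these lines would reward access and outcomes rather than selectivity, which is exactly what the current methodology ignores.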
 
  • Like
Reactions: 4 users
I think the rankings are helpful in showing that schools like Pittsburgh are as well-respected as those with marquee names like Harvard. Without the rankings, everyone would assume that the Ivies and Stanford are the best medical schools, and newer/more innovative schools would never be rewarded for their efforts.

But I say that as a graduate of a school that has steadily increased in the rankings....
 
  • Like
Reactions: 1 user
The Ivies have name recognition for undergrad but not for their med schools, in the case of Brown and Dartmouth. Only the rankings reveal that to a layman.

Guidance is important even for high schoolers deciding between the Pitt BS/MD and the Ivy Brown BS/MD, where no one looks at the med schools; they are in Ivy prestige mode and look at the undergrad only.

Harvard being ranked number one was fixed forever. Hopkins and UPenn are equally good schools, and only with rankings can one confirm that.

No one would have respected Pitt just by its name without the rankings, and folks consider Cornell's Ivy med school much better because of the Ivy name. For most people, Harvard and Stanford are the best medical schools, and the rankings open their perspective beyond the prestige mindset.
 
Last edited:
  • Like
  • Hmm
Reactions: 2 users
Guidance is important even for high schoolers deciding between the Pitt BS/MD and the Ivy Brown BS/MD, where no one looks at the med schools; they are in Ivy prestige mode and look at the undergrad only.
So in this scenario the continuation of the status quo is justified based on a group of high schoolers who are smart enough to get accepted into the Pitt BS/MD and the Brown BS/MD but apparently not smart enough to make a decision without referring to a crappy ranking system. Got it.
 
  • Like
  • Haha
Reactions: 7 users
The Ivies have name recognition for undergrad but not for their med schools, in the case of Brown and Dartmouth. Only the rankings reveal that to a layman.

Guidance is important even for high schoolers deciding between the Pitt BS/MD and the Ivy Brown BS/MD, where no one looks at the med schools; they are in Ivy prestige mode and look at the undergrad only.

Harvard being ranked number one was fixed forever. Hopkins and UPenn are equally good schools, and only with rankings can one confirm that.
The ranking methodology was originally (reverse) engineered to make Harvard #1, so all this reasoning is circular.

No one would have respected Pitt just by name without rankings and folks consider Cornell Ivy med school much better with Ivy name. For most Harvard and Stanford are best medical school and rankings open their perspective unleashing their prestige hat.
I'm willing to bet most of the people in Pitt's catchment area have some grasp of its stature. Outside of that, who really cares?

Also, in 2022 Pitt secured just over $675 million in NIH funding via 1,270 awards. If the rankings went away I think they'd find a way to get over it.
 
I can see this being the case at a liberal arts college where a priority is to have small classes taught by faculty (instead of TAs). As a proxy for medical student education, I'm less convinced. You can have full-time, part-time, and adjunct faculty out the wazoo; that doesn't mean they are creating a better learning environment than a smaller group of education-oriented faculty. Especially since nobody goes to class anymore.
So maybe in the next survey we should request that they include only teaching faculty (those directly involved in undergraduate medical education) as a criterion. We should provide feedback to US News so they can continue to optimize and improve their methodology.

I'm perplexed by this "addition." Not long ago NIH awards were counted in the rankings, then they shifted to all federal grants. Now it's back to only NIH awards? And they are heralding this as some huge improvement? Or am I missing something?
I think in recent years they included industry funding as part of research grants, which is what caused the enormous shift in rankings and NYU’s ascent to the top. Prior to that they included only federal funds for research (don’t quote me on that). Now it seems like they’re only including NIH funds, which differ from federal funds in that "federal" can encompass anything from the government (e.g., the emergency funds disbursed to NYU after Hurricane Sandy), while NIH funding refers only to money awarded for original research projects (usually requiring a lengthy proposal and a commitment from a professor or researcher at an established academic institution).
 
The Ivies have name recognition for undergrad but not for their med schools, in the case of Brown and Dartmouth. Only the rankings reveal that to a layman.

Guidance is important even for high schoolers deciding between the Pitt BS/MD and the Ivy Brown BS/MD, where no one looks at the med schools; they are in Ivy prestige mode and look at the undergrad only.

Harvard being ranked number one was fixed forever. Hopkins and UPenn are equally good schools, and only with rankings can one confirm that.

No one would have respected Pitt just by its name without the rankings, and folks consider Cornell's Ivy med school much better because of the Ivy name. For most people, Harvard and Stanford are the best medical schools, and the rankings open their perspective beyond the prestige mindset.
Pre-meds tend to have a myopic view of institutions based, in part, on the "promise" of rank lists offered by US News. They identified a market for information early on, and have sold (and continue, now, to re-sell, in a desperate attempt to maintain relevance and revenue) the idea that you can gain an "objective" sense of "better" and "worse" institutions by viewing their list and consuming their ads.

News has become a meager industry with internet dilution, and over time they have leaned into their rank lists to stay alive. It's a sad sight to behold.

What is an alternative? Well, one is to ask, "what am I interested in getting out of a medical career?" before applying, and to research schools in advance that provide a path to those goals. But hold on - what if you are a 22-year-old Type-A student who has lived their early life getting perfect scores in classes, in a carrot-and-stick education that propelled high-horsepower memorization without any creative or introspective thought into fundamental questions? What if asking "what am I interested in getting out of a medical career?" is so difficult because you were never taught to properly explore the question "what do I want to do in life?"

Well, you might end up targeting a career in medicine as one of social prestige and lucrative opportunity. You might think, anxiously and repetitively, "how do I get in; how do I get in; how do I get in to the 'best' opportunity possible?" in an extension of the simple hierarchy of grades you learned as a child and had reinforced as an adolescent and young adult. Such a mindset is ripely primed for an industry "telling" you what is a better and worse opportunity. Fundamentally, like before, you had to offer no thought into what you wanted - because the list has a higher number to aim for. Therefore it is better; it doesn't matter "why" - it just matters that you perceive others want it, because it is better, so you want it.

Regardless - if this mental state is ingrained in you, and all you are interested in is aiming for these "best opportunities" as a heuristic - know that medicine is full of in-groups, and out-groups. Your offered examples - Brown, Dartmouth, Pittsburgh - are all well known institutions in the "in-group", and a rank list won't propel one above the other in career-progressed attending minds, and it will likely not influence your career. But - if possible - I would take the first step into asking deeper questions than this.
 
  • Like
Reactions: 8 users
Pre-meds tend to have a myopic view of institutions based, in part, on the "promise" of rank lists offered by US News. They identified a market for information early on, and have sold (and continue, now, to re-sell, in a desperate attempt to maintain relevance and revenue) the idea that you can gain an "objective" sense of "better" and "worse" institutions by viewing their list and consuming their ads.

News has become a meager industry with internet dilution, and over time they have leaned into their rank lists to stay alive. It's a sad sight to behold.

What is an alternative? Well, one is to ask, "what am I interested in getting out of a medical career?" before applying, and to research schools in advance that provide a path to those goals. But hold on - what if you are a 22-year-old Type-A student who has lived their early life getting perfect scores in classes, in a carrot-and-stick education that propelled high-horsepower memorization without any creative or introspective thought into fundamental questions? What if asking "what am I interested in getting out of a medical career?" is so difficult because you were never taught to properly explore the question "what do I want to do in life?"

Well, you might end up targeting a career in medicine as one of social prestige and lucrative opportunity. You might think, anxiously and repetitively, "how do I get in; how do I get in; how do I get in to the 'best' opportunity possible?" in an extension of the simple hierarchy of grades you learned as a child and had reinforced as an adolescent and young adult. Such a mindset is ripely primed for an industry "telling" you what is a better and worse opportunity. Fundamentally, like before, you had to offer no thought into what you wanted - because the list has a higher number to aim for. Therefore it is better; it doesn't matter "why" - it just matters that you perceive others want it, because it is better, so you want it.

Regardless - if this mental state is ingrained in you, and all you are interested in is aiming for these "best opportunities" as a heuristic - know that medicine is full of in-groups, and out-groups. Your offered examples - Brown, Dartmouth, Pittsburgh - are all well known institutions in the "in-group", and a rank list won't propel one above the other in career-progressed attending minds, and it will likely not influence your career. But - if possible - I would take the first step into asking deeper questions than this.
I think you're selling medical students, current and former, a bit short. Recent students I have met, almost to a person, are very thoughtful about why they are in medicine, why they went where they went, and what they want to do with their careers. Much more thoughtful, I think, than some old time docs. This generation is exceptionally capable and you're selling them short to suggest they can't think creatively and independently.
 
  • Like
Reactions: 2 users
A sneak preview of this year's rankings is out. It looks like US News intends to include withdrawn schools going forward, using past and publicly available information. What are people's thoughts?

The rankings are a disaster and look like a desperate attempt by USNews to stay relevant
 
  • Like
Reactions: 1 users
I think you're selling medical students, current and former, a bit short. Recent students I have met, almost to a person, are very thoughtful about why they are in medicine, why they went where they went, and what they want to do with their careers. Much more thoughtful, I think, than some old time docs. This generation is exceptionally capable and you're selling them short to suggest they can't think creatively and independently.
Not at all - I agree that there are many passionate, talented, and introspective people in the field (young and old). I was only describing an unfortunately common pipeline for many. Both exist, and neither group is impervious to change and influence from the other.
 
  • Like
Reactions: 2 users
The preview of the USNWR ranking more or less generates the same list of well-resourced and selective schools in the Top 15 that we have seen year after year. That is because the ranking system does not want to factor in an important index: the economic mobility index.

This would be based on the proportion of low-SES students enrolled and the enhanced economic outcomes that the school provides them. It would be an incentive for med schools to rethink their poor efforts to recruit meritorious but low-SES students (regardless of race, sex, etc.) and to provide appropriate financial and merit-based aid to them.

Needless to say, this would result in a more socioeconomically diverse class of students, more graduates choosing to serve underserved areas and HPSAs, and more going into “less-profitable” specialties such as primary care and pediatrics. Med schools can and need to deliver on the promise of economic mobility for all students while fulfilling their mission of caring for underserved populations and improving health for all. An economic mobility index would be a good place to start.
LOL - why ask USNWR, when med schools themselves can do a lot more for low-SES applicants? Medical schools should take concrete steps to help applicants from socioeconomically disadvantaged backgrounds. They should drop the essay requirements and the emphasis on volunteering and clinical experiences. These requirements are just more barriers for low-SES students who may have limited resources and time to pursue extracurricular activities. Imagine spending months writing essays while needing to work...
 
  • Like
  • Okay...
Reactions: 1 users
What is an alternative? Well, one is to ask, "what am I interested in getting out of a medical career?" before applying, and to research schools in advance that provide a path to those goals. But hold on - what if you are a 22-year-old Type-A student who has lived their early life getting perfect scores in classes, in a carrot-and-stick education that propelled high-horsepower memorization without any creative or introspective thought into fundamental questions? What if asking "what am I interested in getting out of a medical career?" is so difficult because you were never taught to properly explore the question "what do I want to do in life?"

Well, you might end up targeting a career in medicine as one of social prestige and lucrative opportunity. You might think, anxiously and repetitively, "how do I get in; how do I get in; how do I get in to the 'best' opportunity possible?" in an extension of the simple hierarchy of grades you learned as a child and had reinforced as an adolescent and young adult. Such a mindset is ripely primed for an industry "telling" you what is a better and worse opportunity. Fundamentally, like before, you had to offer no thought into what you wanted - because the list has a higher number to aim for. Therefore it is better; it doesn't matter "why" - it just matters that you perceive others want it, because it is better, so you want it.

Regardless - if this mental state is ingrained in you, and all you are interested in is aiming for these "best opportunities" as a heuristic - know that medicine is full of in-groups, and out-groups. Your offered examples - Brown, Dartmouth, Pittsburgh - are all well known institutions in the "in-group", and a rank list won't propel one above the other in career-progressed attending minds, and it will likely not influence your career. But - if possible - I would take the first step into asking deeper questions than this.
Lots of students have no idea what they want out of a medical career and often completely change course between applying, matriculation, and residency. Nothing wrong with aiming for a more prestigious school that allows you more opportunities and keeps more options open, even if you don't end up needing as many of them down the road once you find your path.
 
LOL - why ask USNWR, when med schools themselves can do a lot more for low-SES applicants? Medical schools should take concrete steps to help applicants from socioeconomically disadvantaged backgrounds. They should drop the essay requirements and the emphasis on volunteering and clinical experiences. These requirements are just more barriers for low-SES students who may have limited resources and time to pursue extracurricular activities. Imagine spending months writing essays while needing to work...
Reduce the importance of clinical experience? Reduce the importance of being able to assess a candidate's ability to write? Not sure I agree with those recommendations. Maybe reduce the volume of hours / written material, but not the importance.
 
  • Like
Reactions: 5 users
Reduce the importance of clinical experience? Reduce the importance of being able to assess a candidate's ability to write? Not sure I agree with those recommendations. Maybe reduce the volume of hours / written material, but not the importance.
Why not? As far as I know, in most other countries, including Japan and much of Europe, you don't need all these ECs to get into medical school. And those countries don't necessarily have worse medical care; in fact, some of them have better overall quality. And what do writing skills have to do with medical care? Sure, communication skills are important, but those can be assessed with in-person interviews.
 
  • Okay...
Reactions: 1 users
I don't see a problem with using MCAT scores and GPA in the rankings, as the quality of students attending the school is important.

I don't see how faculty/student ratio plays any part in educational quality, as there is such a disconnect between which faculty are even involved with students. I've gotten ghosted by more people than I can count when reaching out for mentoring/research...

These ranks actually seem pretty good when looking at aggregates of 5, but I think most would consider Stanford the fifth "true" top-5 school rather than WashU.

Disclaimer: I don't attend Stanford or WashU
 
And all these gap years - it is getting more and more important to have 1-2 gap years before medical school, 1-2 years of research during or after medical school before residency, and more gaps before fellowship. The American medical profession keeps throwing more barriers and costs in the way of pursuing a medical career, adding to the overall cost of medical care in the US.
 
  • Like
  • Okay...
Reactions: 6 users
Why not? As far as I know, in most other countries, including Japan and much of Europe, you don't need all these ECs to get into medical school. And those countries don't necessarily have worse medical care; in fact, some of them have better overall quality. And what do writing skills have to do with medical care? Sure, communication skills are important, but those can be assessed with in-person interviews.
Without clinical experience, people have no idea what they're marching into. I think it's imperative that prospective students have some amount of clinical / patient exposure.

Knowing how to write is part of becoming an educated / professional person. A doctor should be able to organize his / her thoughts, develop an argument, defend it, etc. I would not diminish the importance of either one of these.

To go a step further, I'd like to see the MMI interview format used consistently. It's a tremendous tool to gauge someone's ability to listen, to think critically, to organize thoughts and ideas in a time-constrained environment, and to communicate clearly. All four are essential to becoming a good clinician.
 
  • Like
  • Love
Reactions: 3 users
And all these gap years - it is getting more and more important to have 1-2 gap years before medical school, 1-2 years of research during or after medical school before residency, and more gaps before fellowship. The American medical profession keeps throwing more barriers and costs in the way of pursuing a medical career, adding to the overall cost of medical care in the US.
Nobody said becoming a doctor was going to be easy. A single gap year should be sufficient and can be in a paid position (clinical job, research associate, scribe, etc.). Some research can / should be done as an undergraduate --- bench type research often ties in nicely with undergraduate pre-med coursework (e.g. immunology lab, neurological science lab).

Some research can also be done in medical school --- some schools now require it and set their curriculum up to have protected time to do it.

I do agree that in some specialties, the expectations for research output have gotten excessive. But that's not true across the board.
 
  • Like
Reactions: 1 users
USNWR’s transformation from the nation's favorite weekly news magazine of the print era to the kingmaker of academics chronicles the growing importance of “ranking” to the public. Even until last year, US News was the agreed-upon arbiter of prestige and quality in medical schools, and the top-name schools that have conveniently pulled out of the rankings game this year were fine with the system until then.
In fact, Columbia’s dean took great pride in the med school’s ranking last year, until a sudden change of heart this year on the heels of other brand-name schools withdrawing from the USNWR ranking. (And I am not even getting into Columbia's inaccurate undergrad data, exposed by its own whistleblower, which dropped it from #2 to #18.)

Legitimately, this raises questions about the motives of both USNWR and the med schools, and none in a positive way. Does any of this showdown truly help aspiring meritorious premed students from low-SES backgrounds, or the medically underserved communities that these future physicians hope to serve?
 
Last edited by a moderator:
Legitimately, this raises questions about the motives of both USNWR and the med schools, and none in a positive way. Does any of this showdown truly help aspiring meritorious premed students from low-SES backgrounds, or the medically underserved communities that these future physicians hope to serve?
Not sure it helps any students. Low SES or otherwise.
 
  • Like
Reactions: 1 users
Not sure it helps any students. Low SES or otherwise.
I'd like to agree with you, but I am focusing more on the SES piece, as the truth is that USNWR and similar rankings only reinforce income inequality and status. Some of the measures, such as "reputation," have nothing to do with a school's ability to educate students. These top, best-resourced schools need to care about measures that truly count for all Americans, like a socioeconomic mobility index. Many of these ultra-competitive med schools enroll far more students from the top quintile of the income distribution than from the bottom.
 
  • Like
Reactions: 1 user
What's the main purpose of choosing a "higher ranked" school? In my mind it is to make it easier to get a residency spot in a specialty you want and in a location you want. Why are these not included in the methodology? Obviously, it would require proprietary information from schools and probably some statistical analysis, but up until a year ago they had schools eating out of their hand. If I knew beyond a doubt that School A would set me up better for residency apps than School B, that would make an impact on my decision. Marginal differences in grant funding and student:faculty ratio don't do it for me.
 
  • Like
Reactions: 1 user
What's the main purpose of choosing a "higher ranked" school? In my mind it is to make it easier to get a residency spot in a specialty you want and in a location you want. Why are these not included in the methodology? Obviously, it would require proprietary information from schools and probably some statistical analysis, but up until a year ago they had schools eating out of their hand. If I knew beyond a doubt that School A would set me up better for residency apps than School B, that would make an impact on my decision. Marginal differences in grant funding and student:faculty ratio don't do it for me.
It's impossible to truly quantify this because of differences in interests at different schools (lower-ranked schools have a higher percentage of students going into less competitive specialties, so they very well could have a higher percentage match into their top 3 choices - but had the same percentage applied into competitive specialties as at a highly ranked school, would they still have fared better?).

Differences in application quality, such as Step scores, publications, leadership, and essays, would also have to be controlled for between schools, which is much harder than it sounds.

You'd also need to control for where the applications are being sent and for interview yield. If someone from Harvard applies to Harvard, Columbia, Stanford, and Duke, interviews at all four, and falls to their fourth, versus someone who applies to all of those plus a community program, gets no interviews at those four, and thus matches at their first rank (the community program), that gives the illusion the latter "got what they wanted" more than the Harvard grad who matched at their 4th. This becomes even more impossible to control for if a student does not even apply to top programs because they just assume they'll get rejected, even if they'd rather go there.
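Just to illustrate how much even a first-pass attempt would leave uncontrolled, here is a rough sketch of a specialty-weighted match metric; all the data and the "difficulty" weights are invented:

```python
# Rough sketch of a specialty-adjusted match metric. Every number is invented,
# and none of the confounders above (interview yield, self-selection, where
# people even apply) is controlled for - which is the point.

# Each record: (specialty, matched_into_top_3_choice) for one graduating student.
school_matches = {
    "School X": [("derm", True), ("derm", False), ("IM", True), ("IM", True)],
    "School Y": [("FM", True), ("FM", True), ("IM", True), ("derm", False)],
}

# Hypothetical difficulty weights: harder specialties count for more.
difficulty = {"derm": 3.0, "IM": 1.5, "FM": 1.0}

def adjusted_match_score(records):
    """Weight each successful top-3 match by how competitive the specialty is."""
    earned = sum(difficulty[spec] for spec, matched in records if matched)
    possible = sum(difficulty[spec] for spec, _ in records)
    return earned / possible

for school, records in school_matches.items():
    print(school, round(adjusted_match_score(records), 2))
# School X 0.67
# School Y 0.54
```

Even this toy version hinges on a defensible "difficulty" number per specialty and still says nothing about interview yield or self-selection, which is why nobody has a clean way to rank schools on match outcomes.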
 
  • Like
Reactions: 1 user
Ironically, aside from Harvard being dropped to #3 (🤯 lol), their current iteration is VERY similar to how it used to be from like the early 90s to early 2010s.

Harvard was always #1 w/ Hopkins, Penn, WashU, and UCSF rounding out the top 5 and basically just those 4 schools jumping around.

Sometimes Penn was #2, other times Hopkins, other times WashU. Then Stanford started invading the top 5. USNWR will continue to release rankings and students will continue to care, even without university participation lol
 
  • Like
Reactions: 1 users
Some quick initial thoughts:

Seems like greater emphasis was placed on faculty:student ratio, which I personally think is a nice measure of education quality. An NIH grant awards metric for research quality was also added. Personally, I'm a big fan of this addition given how shady industry funding can sometimes be. Overall, I think this year's rankings are a step in the right direction. Of course, not everyone will be happy, but the survey criteria seem to be shifting the right way.
Agreed with NIH grant awards being added. Also glad reputation and MCAT/GPA were dropped. They're actually going in a direction that may really assess research prowess one day in the future lol
 
Adding NIH funding and faculty:student ratio is going in the right direction. On the other hand, diminishing the value of MCAT and GPA is going in the wrong direction.
Hard agree with the first sentence, hard disagree with the second. You can't possibly believe MCAT and GPA have anything to do with research. Let's not forget these are RESEARCH rankings, not selectivity rankings lol
 
  • Like
Reactions: 1 users
Hopkins +2, Penn +4, Harvard -2, UCSF -1, WashU +7, Columbia -3, Stanford +1, Yale +3, Duke -3, Michigan +8, Pittsburgh +3, Northwestern +5, NYU -11, Cornell 0, Mayo 0.

Biggest jump is Michigan at +8, which lines up with it having the most NIH awards of any public university in 2022.

Does anyone know why NYU is -11 from last year's ranking? Everyone still has their eye on NYU anyway, and this won't matter; we like zero tuition.
 
  • Like
Reactions: 1 user
So maybe in the next survey we should request that they include only teaching faculty (those directly involved in undergraduate medical education) as a criterion. We should provide feedback to US News so they can continue to optimize and improve their methodology.
So is there now a comment period for the USNWR ranking methodology? They need a bunch of randos to suggest improvements?

Look, humans obviously love rankings. They're quite pleasing as a mental shortcut. But you can only effectively rank things if you are comparing them by a very small number of criteria (ideally just one criterion). The mere idea that one can formulate a useful ranking of something as complex and varied as medical schools, for an audience with extremely diverse priorities, is, quite frankly, a stupid one. It doesn't work, it has never worked, and it won't ever work.

If the USNWR is looking for feedback, here is mine: take this entire endeavor out behind the woodshed, put a bullet in the back of its head, and bury it in a shallow grave. Then go find something more useful to do with your time.
 
Last edited:
  • Like
  • Love
Reactions: 2 users
So is there now a comment period for the USNWR ranking methodology? They need a bunch of randos to suggest improvements?

Look, humans obviously love rankings. They're quite pleasing as a mental shortcut. But you can only effectively rank things if you are comparing them by a very small number of criteria (ideally just one criterion). The mere idea that one can formulate a useful ranking of something as complex and varied as medical schools, for an audience with extremely diverse priorities, is, quite frankly, a stupid one. It doesn't work, it has never worked, and it won't ever work.

If the USNWR is looking for feedback, here is mine: take this entire endeavor out behind the woodshed, put a bullet in the back of its head, and bury it in a shallow grave. Then go find something more useful to do with your time.

And so what’s the alternative? Dump US News and expect parents and premeds to do thorough, quality research on their own? As mentioned above, the majority of these premeds don’t know what they even want out of a medical school beyond a high-paying, well-respected career that offers human interaction, which, let's be real, every medical school in the country offers. In an ideal world, yes, parents/premeds/etc. would do their own research and logically come to their own conclusions about which med school would be the best fit for them. But the reality is that the vast majority have no clue when it comes to properly researching these schools, so the next best option is to have a third party (like US News) do some basic research for them. While that may have plenty of imperfections, it certainly carries more truth than some tiger mom claiming her local state med school is the best in the country because her friend’s daughter who got in made a post about it on WeChat (true story).
 
Last edited:
  • Like
Reactions: 1 users