Rad onc rankings

Do I dare add my rankings to the hallowed list? Apparently yes, but please be gentle if you don't agree. This is my list of what I think the "top" programs are, from the perspective of someone interested in academics.

Someone just reminded me of this discussion. Nice list -- for academics. Does it apply for community practice careers?

The classic rad onc rankings discussion.... After following this thread for a couple years, I'd like to add some thoughts:

1. SDN rankings are based largely on the opinions of medical students who have interviewed or rotated at these institutions.

2. Once a program is deemed a "top" program, it gets perpetually regurgitated as a top program, which makes it difficult for up-and-coming programs to break into the top tier (e.g. UCSD, which would be in my top 10).

3. Rankings that are often left out of this discussion are the U.S. News Best Hospitals for Cancer rankings, which use a validated methodology for ranking cancer hospitals. The 2013-14 top ten is as follows:

1. MDACC
2. MSKCC
3. Mayo Clinic
4. Johns Hopkins
5. Dana-Farber/Brigham and Women's (Harvard)
6. Mass General (Harvard)
7. UCSF
8. U of Washington
9. Cleveland Clinic
10. Stanford

The classic rebuttal is: well, just because you have a great cancer hospital doesn't mean your rad onc department is equally great. This is absolutely true! However, the quality of radiation oncology training is largely influenced by the caliber of the surgeons and med oncs who surround you, and every one of these programs has an outstanding radiation oncology department.

There are programs whose US News rankings are much lower than where their rad onc departments should be ranked (e.g. Yale (29), Michigan (24), Wash U (21), Duke (18), U Chicago, and even Penn (13)). Most of these programs are deemed top-10 programs. But I think the US News rankings are helpful for assigning an unbiased value to many of these places, which may or may not be valuable to future applicants.
 
I don't know if you were asking me specifically, but the truth is I have no clue whatsoever. I have asked a lot of people this question and have never been able to get a reproducible answer. Would love to hear others' thoughts on it.

I actually did look at these cancer-center rankings when thinking about my rank-list, but in the end decided not to consider them too much. I think you're right that a good cancer center = good things for residents, but I think other, more rad-onc specific parameters are more important. Many of these things aren't really represented in the US News rankings of cancer centers. And as you noted, there are just too many cancer centers on that list with good but not great rad-onc programs, and too many "top 10" rad-onc programs not on the list.

I also agree with your points 1 and 2. There are certainly programs that are over- and under-rated by applicants, relative to what I've been told by attendings and chairs. That's why I kind of wish an "official" publication would rank rad-onc programs the way US News does for graduate biology and chemistry programs, i.e. by surveying department chairs for "reputation scores." That's what I'm most curious about: how do the people hiring for the jobs I want view these programs? Not that this should be anyone's single metric for making a rank-list, but it would be interesting to know. I don't think there's enough of an audience for US News to do it, but maybe a journal could, like Ophthalmology Times does for ophtho programs.

I do understand that there are valid reasons for not wanting rankings (e.g. too much emphasis on "prestige," competitive spirit, etc), but I think they could be a helpful tool, as long as you don't base your whole life or rank-list on them. FWIW my rank-list definitely did not follow my rankings above.
 
Your ranking system is completely wrong. I guarantee you 99% of US radiation oncologists don't put Cleveland Clinic in the top 10. And a radiation oncology department's residency program does not directly correlate with the prestige of the cancer center. I personally wouldn't even list Johns Hopkins or U of Washington in the top 10, or Cleveland Clinic in the top 20.
 
In Radiation Heals' defense, he/she isn't saying those are the top 10 rad-onc programs. His/her point was that there are cancer-center rankings that some might find useful/interesting. He/she (that's getting old) also pointed out that there are significant discrepancies between cancer-center and rad-onc program rankings.
 
Should be an interesting bump to this thread when US News and World report releases their radiation oncology residency program rankings later today. Let the debate begin....
 
Looking at the methodology, you'll see that these "rankings" need to be taken with a grain of salt. Basically, a number of board-certified radiation oncologists with Doximity accounts were selected to fill out a survey asking which 5 programs "provide the best clinical training." So programs with large numbers of graduates, and/or programs with many graduates who have Doximity accounts, did well. When U. Florida and UCLA find themselves in the top 20, ranked ahead of programs like Wisconsin and UCSD, you have to wonder how they put this together...
https://s3.amazonaws.com/s3.doximit...ty_Residency_Navigator_Survey_Methodology.pdf
 

So really this is the same old story. Basically, people ranking their own program along with the other obvious ones at the top.
 
The question is: should board-certified attendings, who are presumably mostly in private practice and out of touch with the current environment in academic residency programs (and who were the only people allowed to vote in this survey), be the sole determinant of rad onc residency program rankings? I would argue this gives a snapshot of which residency programs were notable maybe 10-20 years ago.
 
I agree the methodology was poor.

But is this thread any better? We have mostly MS4s ranking a lot of places they didn't interview at, based on hearsay and who knows what else. Don't get me wrong, I gladly participated and it's a fun read, but I don't think it's the most robust ranking system.
 
Agreed that the methodology breaks down, but so does the methodology of the rankings in this thread, despite how enjoyable it's been to read! The rankings essentially represent the opinions of doctors who participated on Doximity. As mentioned above, there were likely more private-practice docs than academic, but I'm not sure of the numbers. Since 80% of residents end up going into private practice, it's possible that these rankings reflect (somewhat?) the opinion of the rad oncs who will be doing the hiring, even if that opinion is swayed by prestige, etc.
 
What a joke this stuff is. The field has a pompousness problem.
 
In the spirit of objective data, here are the NIH funding awards specifically to Radiation Oncology departments in Fiscal Year 2011 (in millions of $). Credit given to Emory in their presentation.

1) Columbia - 7.8
2) Stanford - 6.3
3) Washington University - 5.3
4) U Michigan - 4.6
5) Yale - 4.5
6) Rochester - 4.5
7) UCLA - 4.2
8) Pittsburgh - 4.0
9) Virginia Commonwealth - 3.9
10) Albert Einstein - 3.6
11) Penn - 3.6
12) Emory - 2.7
13) Wake Forest - 2.7
14) Iowa - 2.4
15) U Chicago - 2.3
16) UT Southwestern - 2.2
17) Alabama - 2.2
18) Duke - 2.2
19) Hopkins - 1.9
20) Vanderbilt - 1.8

I think this data only included centers directly associated with a medical school (which is why you don't see the Harvard hospitals, MSKCC, MD Anderson, Beaumont, Moffitt, etc.). Interestingly, the only "big name" missing from this list is UCSF, but I was glad to see the often-forgotten departments of VCU, Einstein, Wake Forest, and Iowa.

I think the only misleading thing about this data is that some of these centers are receiving funding for military/defense-related research, which some people may scoff at. On the other hand, I would think that only departments with better reputations would be able to secure those big $$ funding opportunities.
 
Funniest rankings I've seen in a while.
 
I think many things matter: resident publications, whether it is an NCI-designated comprehensive cancer center, the number of site-specific tumor boards, residents' board scores (dumb, but somewhat quantitative), location, exposure to modalities (protons, SRS, etc.), whether or not you get your cases without outside rotations, and more. But not one of these has been counted in your model.

Perhaps, but when UTSW and Moffitt opened, I don't think too many people were laughing ;) If a department is solid from a faculty, technology and volume standpoint (and has NCI designation on top of that), it's not unrealistic to think it will be a "top" program rather quickly.

Hi, as a current applicant, I wanted to pose a question to the radonc community about NCI designation and how we should consider that in our decisions as we're interviewing this season. Why/how exactly does it impact a resident's training? Does it change if you have more interest in basic or clinical research opportunities during residency, academics or PP after? I personally would like to keep as many doors open as possible. Thanks!
 
It doesn't matter to a trainee. At all.
 
I apologize ahead of time to those who may be annoyed by me posting in this thread, but has anything changed with the reputation/"rankings" of rad onc programs over the last 2 years? I think it is somewhat helpful for medical students applying this year and in the future.
 
There is a discussion of it in the current Google spreadsheet (see the 2017-2018 chit-chat). It seems like not much has changed.

2017-2018 RadOnc Interview Sheet
 
Just matched. From speaking with students, residents, and faculty, as well as interviewing at most of these places, this is what I got:

Big 3
1. HROP
2. MDACC
3. MSK

A Tier
4. UCSF
5. JHH
6. Stanford
7. Michigan
8. Penn
9. WashU

B Tier
10. Duke
11. Yale
12. UChicago
 