Boards...postmortem. WTF?


Finally M3

No questions or topics...but anyone else want to vent?

There seemed to be a percentage of questions that, if I had a month straight without working to study for, I still would not have gotten close to answering because I had no idea it was in the scope of PM&R.

:laugh:

 
No questions or topics...but anyone else want to vent?

There seemed to be a percentage of questions that, if I had a month straight without working to study for, I still would not have gotten close to answering because I had no idea it was in the scope of PM&R.

:laugh:

yeah...totally...those questions were totally out of left field!
 
I heard the same thing from one of our pain fellows and another grad from last year. Quite a few psych-type questions (unexpectedly) and some obscure agnosia questions too. Not classic PM&R questions like the SAE mostly is. And why don't we have any great review books for that yet? Must be a massive curve each year though b/c of the high first-time pass rate.

axm and vlad, what did u think??
 
I heard the same thing from one of our pain fellows and another grad from last year. Quite a few psych-type questions (unexpectedly) and some obscure agnosia questions too. Not classic PM&R questions like the SAE mostly is. And why don't we have any great review books for that yet? Must be a massive curve each year though b/c of the high first-time pass rate.

axm and vlad, what did u think??

"Classic PMR questions like the SAE mostly is"???

Surely you jest.
 
I thought quite a few of the questions seemed pretty outdated. I tried to answer everything like I would have had I done residency 5-10 years ago; hopefully that was the right approach. I also felt some topics that are marginally relevant to rehab at best were very much overrepresented, whereas other topics more pertinent to what we do did not appear at all. It was really quite odd. I'm not sure if some of the really stale questions were the infamous "qualifier" questions they use to figure out if people are cheating each year - that's what I was thinking as I left the exam. Some questions were clearly out of the scope of PM&R - not sure why this test and the SAE never have "consult ____ specialist" as an answer, when clinically that is what is most appropriate. Oh well, at least it's over. I'm sure everyone did fine.
 
"Classic PMR questions like the SAE mostly is"???

Surely you jest.

Believe it or not...I actually do think that the SAE is a more "PM&R" focused exam...never thought I would say that. At least you can kind of see why they ask the questions they do on the SAE. This test that I took on Monday was the weirdest test I have ever taken. Questions were outta left field, and not only that, the wording of the questions themselves was terrible. I almost felt as though if I didn't study at all...it wouldn't have made a difference. But...it is over...hopefully I passed :rolleyes:
 
Believe it or not...I actually do think that the SAE is a more "PM&R" focused exam...never thought I would say that. At least you can kind of see why they ask the questions they do on the SAE. This test that I took on Monday was the weirdest test I have ever taken. Questions were outta left field, and not only that, the wording of the questions themselves was terrible. I almost felt as though if I didn't study at all...it wouldn't have made a difference. But...it is over...hopefully I passed :rolleyes:

Perhaps a core issue revolves around what is a poor question vs a good question. I define good questions as those that both reflect the content of material that ought to be mastered, and provide an opportunity for a bit of separation of the test takers into differing degrees of mastery (ie better mastery-->higher scores).

I have always had a problem with the SAEs. Perhaps this is because of their tie-in with the Study Guides, which are far too brief to adequately address the topics they are supposed to cover. I will concede, however, that you can generally see where the questions are derived. (Not always--I know of two questions citing only articles that I wrote-->and I had to reread my own articles to figure out how they obtained the answer!! You could say that perhaps my memory is fading, but the questions really focused on less important aspects of the cited articles.)

I did not have to take the recertification exam, so I really can't say which of the two exams had poorer, more obscure questions. Reading the posts, it seems that there were several questions that either came out of the proverbial left field, the stands, or perhaps the parking lot.

Ah yes. The joy of being the last of the board-certified-forever physiatrists. :laugh: Sorry, I shouldn't be so happy, but I couldn't help myself. :D
 
I will cross my fingers before saying this, but overall I thought the exam was fair. Some questions seemed outdated and a few were poorly worded. Cuccurullo will say one thing but the current literature will say another. Oftentimes, I think that the answer depends on who wrote it (i.e. someone from PASSOR or an inpatient physiatrist). No offense to pediatric physiatrists, but thankfully there were far fewer peds questions than I expected. It seems that the ABPMR is trying to make an attempt to catch up with the current paradigm shift in our field. The exam seemed a bit more outpatient focused.

The SAE questions were not quite written the same as the boards. Personally, if you did very well on the SAEs then the boards should not be as bad. Well, we will see.
 
The out-of-the-blue questions are typically in the afternoon session, which contains the experimental questions. Most people feel pretty weird or bad coming out of the test, but the statistics show that the majority pass every year. I wouldn't worry too much about it.
 
So I know you can't talk about specific topics or questions, but could someone at least answer one for me:

Which of the following is an effective way to prepare for the boards:
A. Read and know one of the major texts cover-to-cover
B. Know Cuccurullo inside-out
C. SAE questions, SAE questions, SAE questions
D. Study the entire body of knowledge from orthopedics, neurology, rheumatology, pain medicine, pharmacology, and any other -ology remotely related to pm&r
E. Forget about studying. Your guess is as good as mine what's going to show up.
 
So I know you can't talk about specific topics or questions, but could someone at least answer one for me:

Which of the following is an effective way to prepare for the boards:
A. Read and know one of the major texts cover-to-cover
B. Know Cuccurullo inside-out
C. SAE questions, SAE questions, SAE questions
D. Study the entire body of knowledge from orthopedics, neurology, rheumatology, pain medicine, pharmacology, and any other -ology remotely related to pm&r
E. Forget about studying. Your guess is as good as mine what's going to show up.

Def E...I studied...and I have no idea what the hell that was. The minutiae they ask on the boards are absurd...even if you study Cuccurullo cover to cover...the small details they ask are barely mentioned in the review books, and if they are, they are the smallest details and not major concepts. There are also a handful of non-rehab related questions on there as well...I just hope I don't have to take this stupid thing again...
 
The out-of-the-blue questions are typically in the afternoon session, which contains the experimental questions. Most people feel pretty weird or bad coming out of the test, but the statistics show that the majority pass every year. I wouldn't worry too much about it.

Great...I actually felt better about the afternoon session than the morning session...:mad:
 
Knowing Cuccurullo is probably the best thing to do. The majority of questions I remembered after the test had the answer somewhere in Cuccurullo, although most did not have the "book" label next to them. I tried looking up some of the topics tested in Braddom when I got home from the exam, and most were NOT in it. Didn't check DeLisa, which would be hard to learn cover to cover. I personally went through SAE exams more to help me figure out what I needed to study better- the format of the questions is very different on the Boards (i.e. you actually have to think when you take the SAE whereas on the Boards you either know things or you don't). As outdated as Cuccurullo is, the Boards were even more out of date. Reading recent literature in related specialties would probably be a waste of time (with regards to studying for Boards at least), because they aren't testing current medicine.
 
Great...I actually felt better about the afternoon session than the morning session...:mad:

Well keep in mind I took the test 3 years ago so things may have changed.. they may have mixed in experimental questions in there more randomly now...

Standardized multiple choice exams tend to make people feel weird or horrible walking out of there. Some of the smartest people I know (or at least the best test takers for that matter), walk out of Board exams and such convinced they've failed or done badly.

The scores of people I've mentored and interacted with in the past have revealed to me that people generally do better on these tests than they think they do.

I remember after taking the written boards there were many questions all of us felt were not relevant at all.
 
It's bad when there are words on a test that you never encountered during 4 years of college, 4 years of medical school, and 4 years of residency.

That being said, I would say that about 60% of the test had questions on topics covered in Cuccurullo. Definitely many of them had the book icons. Some things didn't have the book icons. I actually put in marathon sessions with a co-fellow covering Cuccurullo from cover to cover in 18 hours. I did read the book by myself and with others about two times prior to doing that. Having someone else study Cuccurullo with you is great because things you pay attention to, they may think are not important and vice versa.

The remaining 40% of the exam - 20% were more "advanced" questions that perhaps were beyond the scope of Cuccurullo - perhaps those answers could be found in DeLisa or Braddom. The other 20% were just out of nowhere with no relevance to PM&R that I could think of - even with my wild imagination.

So bottom line - Cuccurullo is enough to pass the exam. If you want the Elkins, you can either cheat and find a program with old board questions or just be a trivia genius - like the people who compete on Jeopardy. There were probably about 10 questions that were worded so weirdly I had NO idea what the question was even asking, much less the answer choices.

I was aiming for the pass mark and I am pretty sure I made that cut. We'll see... :scared:

My plan is to make a lot of money so I can start an alternative award - for the bottom person passing the exam that year (just one above the failing person).
 
So bottom line - Cuccurullo is enough to pass the exam. If you want the Elkins, you can either cheat and find a program with old board questions or just be a trivia genius - like the people who compete on Jeopardy.

So, which factor do you think typically has the most impact on the Elkins award each year?


I vote for choice a.
 
Aviator; go with E. Instead of a board review course, go to the Dominican or Cancun and lounge on a beach with an umbrella'd drink.

I chose B & C out of your options. With mixed results.

And btw, I felt much better about the afternoon session...but many of my 'narrowed down to two' questions were wrong in the afternoon, but right in the morning, so who the hell knows.
 
the ridiculousness of the boards (both written and oral) underscores a major identity crisis we have in our specialty. there really isn't a true core or knowledge base from which a comprehensive exam can be derived. instead, we get asked questions that are so esoteric as to be useless. i think that the boards are generally aimed to keep out some of the riff-raff that really have no business practicing medicine. it is not an accurate gauge of a competent physiatrist......
 
i think many of the questions are not very relevant to clinical practice because anything controversial that can be challenged will not be tested. That leaves things like genetic/stroke/SCI syndromes, P&O components, gait mechanics, anatomy, and esoteric facts that cannot be disputed.

so what you do day to day - making clinical decisions and working with patients, running team conferences, family meetings, performing interventional procedures, etc. really are not very relevant to this exam.

Hindsight is 20/20 but if I could do it over, I would get Cuccurullo my PGY2 year, read through the chapter of the rotation I am on, and supplement further knowledge with Delisa/Braddom. Had I done that throughout my residency, I think I would have had a lot less cramming to do last minute.
 
Hindsight is 20/20 but if I could do it over, I would get Cuccurullo my PGY2 year, read through the chapter of the rotation I am on, and supplement further knowledge with Delisa/Braddom. Had I done that throughout my residency, I think I would have had a lot less cramming to do last minute.

Excellent advice. Know what you need to before you need to. Then when (if) you are being taught it, the knowledge is reinforcement and not learning.

I also like Sugar and Choi's Pocketpedia, the old test question databank that every program has had but seems to be kept hush-hush. Password is swordfish - they'll give you a copy.
 
Excellent advice. Know what you need to before you need to. Then when (if) you are being taught it, the knowledge is reinforcement and not learning.

I also like Sugar and Choi's Pocketpedia, the old test question databank that every program has had but seems to be kept hush-hush. Password is swordfish - they'll give you a copy.

I thought the password was "Rosebud". No wonder why I got in so much trouble.
 
i think many of the questions are not very relevant to clinical practice because anything controversial that can be challenged will not be tested. That leaves things like genetic/stroke/SCI syndromes, P&O components, gait mechanics, anatomy, and esoteric facts that cannot be disputed.

so what you do day to day - making clinical decisions and working with patients, running team conferences, family meetings, performing interventional procedures, etc. really are not very relevant to this exam.
.

Respectfully, I disagree here. Actually, I think the breadth of our specialty would make it fairly straightforward to derive "clusters" of fair questions that address common/important clinical issues that the physiatrist will encounter.

Maybe I need to volunteer to be a question maker.
 
The primary purpose of board certification of medical specialists is to assure the public of physician knowledge and skill. I did not think the board examination was difficult or unreasonable, but I was very disappointed in the quality of the questions. I would agree that the type of questions in old SAE tests do a better job of testing knowledge pertinent to the practice of our specialty. The test this year (no intended offense to any of the test makers) was more of an assessment of the test taker's ability to do well on Jeopardy. The overwhelming majority of the questions did not require much thought, which unfortunately does not meet, in my estimation, the primary purpose of the board certification process in assuring the public of physician knowledge and skill.
 
The way I see it, part I of the boards, being multiple choice format, is geared to test primarily medical knowledge (recognizing symptoms, complications of treatment, etc). If your knowledge is not sufficient, then you don’t get to move on to part II.

Part II, the orals, I think is being redesigned to test the other components of being a competent physician – i.e. the other core competencies as outlined by the ABMS/ACGME: patient care, communication, professionalism, practice-based learning, system-based practice. Recent takers tell me the cases are more standardized, where in the past the oral examiners could ask you about pretty much anything they wanted. (Flashback - I had one guy break out a slide carousel and showed a variety of x-rays, photos of prostheses, physical signs - and basically asked me, "what's this...what's this... what's this...")

Whether the boards do an adequate job in assessing these competencies is another issue. Clinical relevance seems to be a recurrent theme in the criticism of the boards. Part of the problem I think is the breadth of our field (I agree with axm on this one). If you get a question writer whose subspecialty is - say cardiopulmonary rehab, or cancer rehab, or peds neuromuscular disorders - what may be clinically relevant for him/her may be considered esoterica for the rest of us. And as axm alluded to, you can't test on recent, controversial stuff where the evidence base is shaky. So the majority of questions become recall-type questions, because they're easy to write and fact-check.

What I've noticed - is that nowhere is there a place to assess competency on procedures. The EMG boards did to a limited extent. Can't speak about the pain boards. But the rehab boards - not that I recall.
 
The way I see it, part I of the boards, being multiple choice format, is geared to test primarily medical knowledge (recognizing symptoms, complications of treatment, etc). If your knowledge is not sufficient, then you don’t get to move on to part II.

Part II, the orals, I think is being redesigned to test the other components of being a competent physician – i.e. the other core competencies as outlined by the ABMS/ACGME: patient care, communication, professionalism, practice-based learning, system-based practice. Recent takers tell me the cases are more standardized, where in the past the oral examiners could ask you about pretty much anything they wanted. (Flashback - I had one guy break out a slide carousel and showed a variety of x-rays, photos of prostheses, physical signs - and basically asked me, "what's this...what's this... what's this...")

Whether the boards do an adequate job in assessing these competencies is another issue. Clinical relevance seems to be a recurrent theme in the criticism of the boards. Part of the problem I think is the breadth of our field (I agree with axm on this one). If you get a question writer whose subspecialty is - say cardiopulmonary rehab, or cancer rehab, or peds neuromuscular disorders - what may be clinically relevant for him/her may be considered esoterica for the rest of us. And as axm alluded to, you can't test on recent, controversial stuff where the evidence base is shaky. So the majority of questions become recall-type questions, because they're easy to write and fact-check.

What I've noticed - is that nowhere is there a place to assess competency on procedures. The EMG boards did to a limited extent. Can't speak about the pain boards. But the rehab boards - not that I recall.

I'm gonna have to disagree with the first part of what you said. I do not think that part I tested overall medical knowledge...at least not from a rehab point of view. The questions were poorly written, and they seemed to be overly concerned with crafting "cute" questions in attempts to divert the test taker from choosing the correct answer. The whole "narrow down to two" answers idea is fine...so long as the body of the question allows one to reasonably pick the "correct" answer. That was not, in my opinion, the case with this exam. I typically do fairly decently on standardized tests, and usually am pretty on target with how I performed...but not with this test. I honestly have no idea...but I can remember a significant number of questions that I was iffy on, not because of the subject matter, but because of the structure of the questions themselves. Anyway, whatevs...outta my hands now...just gonna have to wait and pray...:confused:
 
Respectfully, I disagree here. Actually, I think the breadth of our specialty would make it fairly straightforward to derive "clusters" of fair questions that address common/important clinical issues that the physiatrist will encounter.

Maybe I need to volunteer to be a question maker.

You could be considered after 5 years of being Boarded.
They do this to keep all the angry hotheads from making the test "useful".
Guess how I know?


Only 2 years til I'm eligible.
 
The primary purpose of board certification of medical specialists is to assure the public of physician knowledge and skill.

How can a multiple choice test ever possibly estimate clinical judgment and skill? Completing residency should be adequate assurance to the public of this.
This method of testing for specialty certification is a farce and a scam. Medicine should look at other scientific disciplines, where a culmination of work combined with a final skill check/project/dissertation is the last hurdle.
 
How can a multiple choice test ever possibly estimate clinical judgment and skill? Completing residency should be adequate assurance to the public of this.

"should be" - operative words. But it isn't. Plus, even idiots can graduate a residency, I saw plenty.

This method of testing for specialty certification is a farce and a scam. Medicine should look at other scientific disciplines, where a culmination of work combined with a final skill check/project/dissertation is the last hurdle.

Who would do the testing then? Your residency director? S/He has a vested interest in your passing it = bias and conflict of interest. If you're referring to a dissertation for a PhD, they are with one doc essentially their entire training, then tested by the other docs in the dept. We don't have that - we meet and work with everyone in the dept. So you'd have to go to an outside tester, and then you'd have to have it standardized, and that would create regulations, more things in writing, and you'd end up where we are now.
 
How can a multiple choice test ever possibly estimate clinical judgment and skill? Completing residency should be adequate assurance to the public of this.
This method of testing for specialty certification is a farce and a scam. Medicine should look at other scientific disciplines, where a culmination of work combined with a final skill check/project/dissertation is the last hurdle.

The issue of creating an optimal "maintenance of certification" (MOC) program is controversial, to be sure. I think it is especially challenging in a field as broad as ours. I seriously question which factors carry the greatest influence with our Board (ABPMR) regarding these issues. Is there any serious effort to confer with our specialty members to get a sense of their perspectives on these issues? Is there really evidence to support the expense/effort involved in MOC? Speaking only for myself, I remain unconvinced that the hassles/expense associated with re-examination each decade are worthwhile, particularly for those of us whose practices have evolved most closely into subspecialties. In those instances, recertification in the subspecialty seems more reasonable.

If it is of any comfort to anyone in this forum, our neurology colleagues appear to be dealing with some of these challenges in their MOC. For those that might be interested, I have included citation info from this week's issue of Neurology, regarding two interesting articles about their MOC program.

Corboy JR, Elkind MSV. Maintenance of certification: Ready or not, here it comes. Neurology 2008;71:544-545.

Faulkner LR, Tivnan PW, Johnston MV, Aminoff MJ, Coyle PK, Crumrine PK, DeKosky ST, Jozefowicz R, Massey JM, Pascuzzi RM. Invited Article: The ABPN maintenance of certification program for neurologists: Past, present, and future. Neurology 2008;71:599-604.
 
when are those of us who took written boards going to get results? it's been a month today....
 
Still no results...?
 
Now I am really anxious to get my score back.
 