This will be a tad bit controversial

scoopdaboop

Full Member
2+ Year Member
Joined
Jun 24, 2019
Messages
791
Reaction score
885
So, I know this forum has two extremes: the AI naysayers and the AI fearmongers. But as someone interested in radiology (M2), I was looking at some random job offers (yes, I am ahead of myself) and saw one advertising "PACS with AI integration". So I was curious: how has AI already penetrated some people's radiology practices? Thanks.

Worklist prioritization is going to be the main first use of AI. For example, Nines (Radiologist Jobs in Top Quality Teleradiology Practice.) uses AI to prioritize scans that have intracranial hemorrhage and mass effect. The effect is that you read the head CT with the intracranial hemorrhage 5 minutes after it's done rather than 25 minutes after. The side effect is that another head CT that's negative, or shows hydrocephalus, which you would have read 25 minutes after it's done will instead be read at 30 minutes. It's going to benefit patient care in a way that's difficult to measure, because overall report turnaround time and length of stay won't change; turnaround improves only for the studies positive for the AI program's target conditions, and imperfectly (with overcalls) at that.

Another program is Viz.ai (Viz.ai, Inc.), which is similar to worklist prioritization, for large vessel occlusion in acute ischemic stroke, except it bypasses the radiologist and alerts the stroke neurology team directly. I don't know how effective this is in real life. At my institution, which is a comprehensive stroke center, all stroke codes that warrant a CTA are attended to immediately by a radiologist, usually at the scanner before the images even make it to PACS, because the emergency medicine and stroke neurology teams that examined the patient are acting as the worklist prioritization for us by escalating to a code stroke in the first place. The order of operations is that the stroke doctor examines the patient and then you get imaging, not the other way around.

Overall I'm optimistic about the utility of AI for worklist prioritization. Certainly when I have been on nights, with volumes varying and the speed of the people reading alongside you varying too, there are times when the list gets really deep and you just hope there's no rapidly life-threatening condition in that stack.
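
For the curious, here's a minimal Python sketch of what worklist prioritization amounts to under the hood: studies the model flags jump ahead in a priority queue while everything else stays first-come, first-served. The class and flag names are invented for illustration; this is not any vendor's actual API.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Study:
    sort_key: tuple = field(init=False, repr=False)
    accession: str = field(compare=False)
    arrival_order: int = field(compare=False)
    ai_flag: bool = field(compare=False)  # e.g. model suspects hemorrhage

    def __post_init__(self):
        # Flagged studies sort first (0 < 1); ties break on arrival order.
        self.sort_key = (0 if self.ai_flag else 1, self.arrival_order)

incoming = [
    Study("CT-001", 0, ai_flag=False),  # negative head CT
    Study("CT-002", 1, ai_flag=False),  # hydrocephalus (not a target finding)
    Study("CT-003", 2, ai_flag=True),   # suspected intracranial hemorrhage
]

heapq.heapify(incoming)  # the live worklist
while incoming:
    study = heapq.heappop(incoming)
    print(study.accession, "flagged" if study.ai_flag else "routine")
# CT-003 is read first; CT-001 and CT-002 each wait a little longer,
# which is exactly the trade-off described above.
```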
 
Hey, thanks a lot. Also, nice Twitter.
 
The RAPID software is something we have incorporated, and it was a requirement in many of the acute stroke interventional trials. It is a nice tool, but you should always look at your own imaging and draw your own conclusions.

 
Hey, does the software have lots of FNs or FPs? Or is it accurate?
 
This program was used in many RCTs (DAWN, DEFUSE 3, SWIFT PRIME, EXTEND-IA).
 
Hey, does the software have lots of FNs or FPs? Or is it accurate?

Depends on your definition of lots. Also depends on the tasks the automated software is performing.

For non-contrast head CTs, they have a RAPID package which just says "Hemorrhage detected" or "No hemorrhage detected". I've seen it be wrong both ways: missing subtle subarachnoid hemorrhage, or over-calling calcifications as hemorrhage. I don't even look at the RAPID summary for that. It's mainly for the clinicians, but we rads were kind of annoyed when it was added to the protocol, because someone who doesn't know better would think it's the interpretation, not an AI feature.

For perfusion studies, I've seen the hypoperfusion/infarct values be way off when there are background morphological changes like encephalomalacia (especially multifocal encephalomalacia). I used to think the RAPID perfusion summary maps were pretty good, but a while ago my group had a case where the diffusion and ADC were both normal. RAPID called the study normal, but the software (and, to be honest, the rad too) missed the elevated mean transit time in one place signaling a hyperacute stroke.

RAPID and other AI software are only going to get better over time, but they're not particularly amazing right now.
 
I haven't had much experience with CT perfusion, but my perception from attending talks on it is that it's pretty noisy to begin with, so different software packages (e.g., RAPID vs. syngo.via) are going to spit out different results (occasionally very different) based on arbitrarily different parameters for thresholds and region growing. Correct me if I'm wrong, but these packages are not machine learning/deep learning/neural network based; they're manual feature engineering.
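
To make the threshold point concrete, here's a toy numpy sketch with entirely fabricated data: the same synthetic Tmax map yields very different "hypoperfused volumes" depending on where you set the cutoff (Tmax > 6 s is a commonly used threshold), so two packages with different internal parameters can disagree on the same patient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 64x64x20 Tmax map in seconds: ~3 s background transit time
# with a blurry region of delayed transit in one corner (fabricated).
tmax = rng.normal(loc=3.0, scale=1.0, size=(64, 64, 20)).clip(min=0.0)
tmax[40:60, 40:60, 5:15] += rng.normal(loc=5.0, scale=2.0, size=(20, 20, 10))

voxel_volume_ml = (2.0 * 2.0 * 5.0) / 1000.0  # 2 x 2 x 5 mm voxels in mL

for threshold_s in (4.0, 6.0, 8.0):
    n_voxels = int((tmax > threshold_s).sum())
    print(f"Tmax > {threshold_s:.0f} s: {n_voxels * voxel_volume_ml:.1f} mL")

# Small changes in the cutoff (or in the smoothing, region growing, and
# deconvolution that produce the Tmax map in the first place) can swing
# the reported lesion volume substantially.
```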

If we're going to call basically any software package "AI", which is technically correct but commonly just a marketing scheme, then other examples of computer-aided detection/diagnosis that have made it into routine clinical use include the following:
  • mammography (where is there an asymmetry/mass or calcifications?)
  • brain FDG PET (is the metabolism in this area different from norms?)
  • brain ioflupane SPECT (is the uptake in the striatum different from norms?)
  • pulmonary nodule detection (which image on chest CT has a nodule and how big is it?)
  • cardiac imaging (what is the ejection fraction? where is there infarct?)
  • I've heard of something that quantifies regional volume loss on brain MRI but I forget the vendor
There are also many examples of automated image post processing, like bone removal on chest radiograph or head and neck CTA, or vessel removal on chest CT.

The "AI" buzz is just the latest incremental advance in computer-aided image analysis in our field. Mammography CAD was FDA approved in 1998 and two decades later, it is ubiquitous but still of dubious utility. I expect to see better tools in 2040 but progress will be incremental and radiologists will have time to adapt.
 
This makes me feel better. MS3 with a strong interest in rads, but I see all of these AI tools and I just wonder what the role of the radiologist will be in 20 years. If AI can do 30% of the job in 10-15 years, what will the job market look like?

Then again, it is what I see myself liking most, so I should probably just go for it. It's hard to predict the future, and rads are smart people who have the opportunity to add to patient care in a multitude of ways, I'd imagine.
 
I believe the more tools radiologists have to make better (accurate, reliable) interpretations, the more important radiologists will become, not less.

First, imaging will be a more important part of the diagnostic workup. Imagine an AI program that helps predict the onset of Alzheimer disease on brain PET five years before a clinical diagnosis can be made. If there are preventive interventions in the future, this ability would create a new indication to image people, driving up the demand for radiology.

Second, radiologists will have control over these tools and know how to use them appropriately. This will be a special skill, and access to the tools will be limited to radiologists. These are both important factors. Referrers coming to the reading room often remark they are envious of how much better things look on our fancy monitors, but that is only part of the trick. The other is the years of training. Here's an analogy. Pathologists order and interpret special stains and molecular tests. The advent of these tests didn't replace pathologists looking at H&E; they created more work. They're complex, sometimes nonspecific, and need to be considered in context of other findings.

That leads me to a prior point:

I don't even look at the RAPID summary for that. It's mainly for the clinicians, but we rads were kind of annoyed when it was added to the protocol, because someone who doesn't know better would think it's the interpretation, not an AI feature.
This is exactly the wrong way to roll out an AI tool: useless to the radiologist and yet visible to the clinician, who will be misled by wrong calls and think it trumps the radiologist's word. Radiologists need to own the clinical deployment of AI for this very reason. How did this ever happen? Rads oversee the scanners and techs. I am eager to see how startups that attempt to bypass the radiologist will end up.

The ideal deployment of AI is a result that is an unobtrusive sidebar in the PACS window, not a permanent image unless the radiologist stores it. We do this for a machine learning-based breast density assessment program for mammograms - it's one piece of information that shows up while you're reading a study, which we can choose to follow or not, and then it disappears.
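
As a rough sketch of that pattern (all names here are invented for illustration, not any PACS vendor's API): the AI output is advisory metadata the radiologist can accept or simply let vanish, and nothing persists by default.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdvisoryResult:
    label: str         # e.g. "breast density: heterogeneously dense"
    confidence: float  # shown to the radiologist, but not binding

class ReadingSession:
    def __init__(self, advisory: AdvisoryResult):
        self.advisory: Optional[AdvisoryResult] = advisory  # sidebar content
        self.stored: Optional[AdvisoryResult] = None        # nothing kept yet

    def accept(self):
        # Only an explicit accept promotes the AI output into the record.
        self.stored = self.advisory

    def close(self):
        # Closing the study discards an unaccepted advisory entirely.
        self.advisory = None
        return self.stored

session = ReadingSession(AdvisoryResult("heterogeneously dense", 0.87))
print(session.close())  # radiologist ignored it, so nothing was stored: None
```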
 
If AI can do 30% of the job in 10-15 years, what will the job market look like?
It is a fool's errand to try to predict the job market of any specialty 15 years from now. There could be a recession, a housing bubble, a tech bubble, a health care bubble, Medicare for All, midlevel encroachment, COVID-35, Ebola-New York. In radiology, it's not just AI, there could be a proliferation of "radiology assistants," a resurgence in neurologists and cardiologists doing imaging fellowships, new data on the safety of low dose ionizing radiation exposure, new imaging tests with more/different diagnostic uses (new radiotracers, magnetic nanoparticles, photon counting CT, ultra low dose CT, high field MRI, portable low field MRI, contrast-enhanced ultrasound, point of care sharks with laser beams attached to their heads?).

The radiology job market 10 years ago was terrible. Many medical students turned away from the field. When I was in medical school, radiology was at a historical low in competitiveness, and my advising dean loved to point out the NRMP stats at each class advising meeting, including how radiology was no longer a competitive field. It's a much improved picture now, and those who stuck with the field they were intrinsically interested in have the last laugh.
 
Second, radiologists will have control over these tools and know how to use them appropriately. This will be a special skill, and access to the tools will be limited to radiologists. These are both important factors. Referrers coming to the reading room often remark they are envious of how much better things look on our fancy monitors, but that is only part of the trick. The other is the years of training. Here's an analogy. Pathologists order and interpret special stains and molecular tests. The advent of these tests didn't replace pathologists looking at H&E; they created more work. They're complex, sometimes nonspecific, and need to be considered in context of other findings.

Isn't the counterpoint to this the fact that there is no program specifically making diagnoses or giving clinical correlates in pathology, whereas in radiology it is happening, albeit in a very limited capacity, such as looking for ischemia or breast cancer?

But radiologists are necessary because physicians want to collaborate with them, and they do certain procedures. So even if AI becomes a stud, I don't think the demand for radiologists will go down.

Also, my understanding is that in private practice models, a radiology group will contract with multiple hospitals, right? Sometimes these groups have one person reading for those hospitals on a shift. So if radiology groups already employ one physician per shift instead of two or three per group of hospitals, I don't know what changes if AI suddenly comes in. You still need radiologists to work shifts at different times.
 
These programs that are in clinical use are not exactly making diagnoses. They're flagging abnormalities or coming up with some number for some physician to absorb the liability and translate into actionable words. It's not diagnosing brain ischemia, it's measuring the volume of a brain region on CT perfusion that has a Tmax (time to maximum of the residue function) greater than 6 seconds. It's not diagnosing breast cancer, it's flagging an area that may represent a mass or suspicious calcifications.

Pathology has automated Pap smear screening, which can either screen out the 25% of samples that look most normal or double-read all samples with the cytotechnologist.
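
The triage math behind that first option, as a toy sketch with fabricated scores: rank slides by the model's "normalcy" score, auto-archive the most normal-looking quartile, and route the rest to the cytotechnologist.

```python
import numpy as np

rng = np.random.default_rng(42)
normalcy = rng.uniform(size=1000)  # fabricated: higher = looks more normal

cutoff = np.quantile(normalcy, 0.75)   # boundary of the most-normal quartile
auto_archived = normalcy >= cutoff     # skipped by the human screener
to_cytotech = ~auto_archived           # still read by the cytotechnologist

print(f"auto-archived: {int(auto_archived.sum())} of {normalcy.size}")
print(f"to cytotech:   {int(to_cytotech.sum())}")
# Human workload drops by about a quarter, at the cost of trusting the
# model's ranking at the "most normal" end of the distribution.
```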

Pathology is just starting to digitize whole slides, but once that becomes popular, you can imagine the slew of potential applications for automation. Low hanging fruit would include quantification of mitoses or more reliable scoring of nuclear atypia.
 
Ah okay, I see what you're saying.
 
I thought it was informative to read the fine print in the FDA approval for the Viz.ai large vessel occlusion program.
"ContaCT uses an artificial intelligence algorithm to analyze images for findings suggestive of a pre-specified clinical condition and to notify an appropriate medical specialist of these findings in parallel to standard of care image interpretation. Identification of suspected findings is not for diagnostic use beyond notification."

"Images that are previewed through the mobile application are compressed and are for informational purposes only and not intended for diagnostic use beyond notification. Notified clinicians are responsible for viewing non-compressed images on a diagnostic viewer and engaging in appropriate patient evaluation and relevant discussion with a treating physician before making care-related decisions or requests. ContaCT is limited to analysis of imaging data and should not be used in-lieu of full patient evaluation or relied upon to make or confirm diagnosis."

"In all scenarios, trained radiologists read all images per standard of care, regardless of the performance of the ContaCT Device."

 
Radiology will need more tools to reduce workload as volume rises. Volume has been rising faster than we've been producing radiologists for a long time, and the average rad has probably reached the peak of how efficient one can become without sacrificing much accuracy. Most young rads are not fearful of the technology; they hope it will come sooner rather than later.

No one can predict what will happen in 15 years, but almost certainly imaging demand will continue to increase as newer generation CT, and eventually MR, become widely available in small centers.

I do believe the field is very safe from AI penetration or midlevel infiltration because it is rarely a 'yes/no' affair. If you are aware of what it's like to work with AI, you understand that the idea that someone will create an AI that can interpret imaging better than a rad in most or all situations is as laughable as the idea that some pharma genius is on the cusp of creating a 'cure for all cancer' pill and eliminating the need for surgical oncologists.
 
There is virtually zero AI penetration into radiology at this point and most practicing radiologists are unconcerned in the near/medium term. CAD for mammo isn't AI.
 