Survey Says: AI Isn’t Going Anywhere in Oncology, But Validity Concerns Remain

Nearly half of oncologists use AI tools for document and communication/editing purposes in their practice, and most foresee an increase in AI integration.

AI in Oncology | Image Credit: © Cornflakesei - stock.adobe.com


Nearly half of oncologists are utilizing artificial intelligence (AI) tools for document and communication/editing purposes in their clinical practice, and most foresee a significant increase in AI integration within the next 5 to 10 years, according to results of a survey conducted by OncLive®.

Within the survey, which generated 30 responses, 43.3% of oncologists reported that they use AI tools for document/communication/editing purposes, followed by imaging analysis (26.7%), treatment decision support systems (26.7%), pathology (13.3%), and patient-reported quality-of-life (QOL) tools (6.7%). A total of 46.7% of participants responded that they use “other” tools, which included Dragon, ChatGPT, and AlphaFold2, a protein structure prediction program.

Of the 30 oncologists surveyed, most were academic oncologists (66.7%), followed by community medical oncologists (16.7%), surgical oncologists (6.7%), and hematologists, radiation oncologists, and other (3.3% each).

Oncologists who reported relying on AI tools for communication/document editing said they did so sometimes (23.3%), rarely (20.0%), or often (13.3%).

Additionally, when looking to the future, 63.3% of respondents said they expect to see a significant increase in AI integration compared with a moderate increase (30.0%) or its use remaining the same (3.3%). One participant voted “other,” explaining that surgical intuition is still needed.

Patrick I. Borgen, MD, chair of the Department of Surgery and head of Maimonides Breast Center, the Maimonides Cancer Center, in Brooklyn, New York, and an editorial advisory board member of OncologyLive®, interpreted the survey results in an interview with OncLive.

“Artificial intelligence is an inescapable reality for all our futures, and we are already using it for patients to book appointments and manage those appointments. Where I am currently the most interested is in decision support,” Borgen explained. “In breast cancer, at the [2024] San Antonio Breast Cancer Symposium, we saw that there are 5 individualized lines of therapy for metastatic HER2-positive breast cancer. For each step along the way, and there are about 25 steps, there’s a trial that supports the decision to use drug A, B, or C. It is not going to be possible to [make treatment decisions] without a reasonably high level of decision support, so I think [AI is] here to stay. How we employ it, how we make sure it’s safe and HIPAA compliant, and [how we ensure] that we control it and it doesn't control us, is a big part of the excitement and the challenge."

The respondents who selected that they use AI for pathology analysis replied that they use it only for sending results for pathology examinations, AlphaFold2 for antibody design, improving the detection of cancer of unknown primary, electronic medical records, or cytopathology purposes in cervical cancer screening, as “[these tools] give better definition of equivocal pictures.”

Regarding patient-reported outcomes and QOL tools, oncologists reported using AI for QOL questionnaires (26.7%), AI-based predictive models for patient outcomes (23.3%), symptom tracking (10.0%), or other uses (53.3%), which comprised history, subjective reporting, and research; many respondents wrote in that they do not use such tools at all.

The survey also asked whether there were any specific AI tools or technologies oncologists hoped were available for use in practice, examples of which included:

  • Imaging definition in radiological cancer staging
  • Symptom tracking
  • Differential diagnostics
  • Treatment decisions
  • Note scribing/capture of clinic/patient encounters
  • Analysis of genomic findings
  • Clinical trials screening
  • Prediction of adverse effects
  • Radiology
  • Interpretation of CT/MRI results
  • Decision-making algorithms
  • Pathology and image analysis
  • Optimized search for finding relevant journal articles
  • Consolidated fragmented health records
  • Organizing and updating progress notes
  • Patient outcomes

One respondent said, “I would love AI to reply to portal messages, as these almost never require my expertise and take a good portion of my time for no pay,” whereas another wrote that they would like to see “a tool that can analyze clinical data and give the association between certain genes, pathways, or cell types and patient survival.”

Tools that would add value to a physician’s clinical or academic life were also discussed, and included better user-friendly programs; triaging of overnight calls, portal messages, and emails; increased efficiency of communication and research summaries and increased accuracy of pathology/radiology reports; personalized treatment algorithms; improved molecular data analysis; summarized research; diagnostic accuracy; note and health record simplification; and assistance with documentation.

Borgen noted the cost- and time-saving benefits of utilizing some of these AI tools; however, he also offered a cautionary tale.

“At the upcoming [2025] Miami Breast Cancer Conference, my junior partner, Joshua Feinberg, MD, will be presenting work we did with ChatGPT and Google Gemini, [which we had take] all the various breast cancer board examinations—BESAP, SSO, and ASBrS’s pre- and post-test for fellowship. Both platforms got approximately 70% of the questions correct. That’s not bad, unless you’re using Dr Google, [in which case] there’s a 30% chance that the answer [you find] is not correct. The platforms fell short with image analysis, x-ray analysis, and pathology analysis,” Borgen added. “That doesn’t mean they won’t get better in the future, but there is a cautionary note about understanding that if you’re using a platform, [you need to] understand its pros and cons and what its limitations are.”

Addressing the Skepticism With AI

The survey also included a question on whether there were any AI tools that physicians are skeptical about using in clinical practice, responses to which varied from “advantages in surgical techniques—no clear opinion,” to “algorithms. It’s hard to factor in so many patient factors.”

Other responses included:

  • Patient communication and outcomes
  • Management algorithms as a supplement to tumor boards
  • Decision-making tools
  • Predictive pathology
  • ChatGPT
  • Radiomics data (given training models required)
  • AI script tools for writing notes “given concerns for HIPAA compliance”
  • Imaging

One respondent answered, “most of them. Not sure they add much,” whereas another said, “All [tools]—I don’t trust them yet.”

Borgen emphasized that human-assisted learning can help the efficacy and accuracy of AI tools in clinical practice.

“I think for someone who has a blanket reaction of, ‘I don't trust this, I don’t need it, and I don’t use it’—that’s going to be an enormous mistake in the future. AI is here to stay. It has problems. We’re using a platform called PLAUD for our various medical meetings, and PLAUD does a recording, and then it does a series of smart documents summarizing key points and action items, and it makes mistakes,” he explained. “Obviously it’s early, and the technology does require human oversight at this point, but to say that it’s not going to be the future, or ‘I'm just going to put my head in the sand because I don’t like it right now,’ is a bit naive.”

Along these lines, the survey asked what concerns, if any, physicians have regarding the accuracy and reliability of AI tools in cancer care. The responses again varied, with one participant replying, “All things we must take ‘cum grano salis’.” Other responses included:

  • HIPAA and privacy
  • Reliability and validity of recommendations
  • Desire to see validation of said tools
  • Time to test them and utilize real-world studies of AI technology
  • Missing the big picture
  • Not for prescription recommendations
  • Untrustworthy

One respondent replied with, “AI tools can perpetuate biases that are built into the [large language model] that built the models, so we need to watch that,” whereas another said, “[Patients’] presentations are nuanced—not sure AI can accommodate the heterogeneity intrinsic to all our patients.”

Ethical or practical challenges with increased use of AI in oncology were also discussed, with responses such as, “We can lose humanity and close contact with patients, [and] also forget to think alone,” “Does AI integration vs direct contact in patient communication vary across socioeconomic or demographic strata? Do recommendations from AI perpetuate systemic biases?” and “Drug company interference in using AI to give their products priority.”

Other concerns included:

  • Patients’ involvement in decision-making
  • Privacy
  • Accuracy and personalization
  • No reality check
  • Confidentiality
  • Inability for AI to take nuances of patient goals of care into account
  • Trust in results
  • Perpetuation of inequities and errors in the models
  • Deviations from AI-prescribed treatment algorithms will lead to legal jeopardy
  • Poor data relating to social inequalities in training of the model

In conclusion, Borgen noted that although the possibilities with AI use in oncology are exciting, the surface has barely been scratched.

“It is so much in its infancy, and like everything else, we have to control our steps going forward. We have to be smart about it. I have a self-driving car, but I don’t really let it drive itself, except in stop-and-go traffic on Staten Island. I’m not going to get on the highway and let this thing take me. With the complexities of oncology, to suggest that we’re not going to need help in how we match treatment with disease is foolish. [AI is] going to be a central part of [that process]. There are so many trials, subtleties, and nuances that are going to have to become part of the algorithm for how we treat patients going forward. [AI is] not going to put us out of a job. It’s going to make our job different.”

Reference

The Use of AI in Clinical Practice. OncLive. November 6, 2024. https://www.surveymonkey.com/results/SM-6Zkhqvj2202Umn3Ycfrw_2Bw_3D_3D/