Bidirectional Encoder Representations from Transformers in Radiology: A Systematic Review of Natural Language Processing Applications

Larisa Gorenstein*, Eli Konen, Michael Green, Eyal Klang

*Corresponding author for this work

Research output: Contribution to journal · Review article · peer-review


Introduction: Bidirectional Encoder Representations from Transformers (BERT), introduced in 2018, has revolutionized natural language processing. Its bidirectional understanding of word context has enabled innovative applications, notably in radiology. This study aimed to assess BERT's influence and applications within the radiologic domain.

Methods: Adhering to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, we conducted a systematic review, searching PubMed for literature on BERT-based models and natural language processing in radiology from January 1, 2018, to February 12, 2023. The search encompassed keywords related to generative models, transformer architecture, and various imaging techniques.

Results: Of 597 results, 30 met our inclusion criteria; the remaining studies were unrelated to radiology or did not use BERT-based models. All included studies were retrospective, and 14 were published in 2022. The primary focus was on classification and information extraction from radiology reports, with x-rays as the most prevalent imaging modality. Specific investigations included automatic CT protocol assignment and deep learning applications in chest x-ray interpretation.

Conclusion: This review underscores that the primary application of BERT in radiology is report classification. It also reveals emerging BERT applications in protocol assignment and report generation. As BERT technology advances, we foresee further innovative applications. Its implementation in radiology holds potential for enhancing diagnostic precision, expediting report generation, and optimizing patient care.

Original language: English
Journal: Journal of the American College of Radiology
State: Accepted/In press, 2024


Keywords:
  • Language models
  • Natural language processing
  • Radiology


