Is generative AI the future of rapid and accurate chest radiograph interpretation in the ER?


In a recent study published in the journal JAMA Network Open, researchers evaluated the accuracy and quality of chest radiograph reports generated by an artificial intelligence (AI) model interpreting input images, and compared them with teleradiology and in-house radiologist reports.

Study: Generative Artificial Intelligence for Chest Radiograph Interpretation in the Emergency Department. Image Credit: April Stock/Shutterstock

Background

Timely interpretation of clinical data and rapid processing of patients are critical in emergency departments. Although the discrepancy rate between chest radiograph interpretations by emergency department physicians and radiologists is low, prompt interpretation of these images by radiologists can avoid problems such as changes in treatment and callbacks of discharged patients. Furthermore, given the increasing use of radiology for diagnosis in the emergency department, there is growing interest in developing systems that can rapidly interpret radiographs to streamline and expedite patient care.

The lack of dedicated radiology services, or of 24-hour coverage, in free-standing emergency departments is currently addressed through teleradiology services or preliminary interpretation of radiology images by residents. These options present various challenges and opportunities for error, owing to factors such as limited access to complete clinical records. However, the use of AI to interpret clinical data is rapidly growing as a viable option in medical settings.

About the study

In the current study, the team developed an AI tool for interpreting chest radiographs and retrospectively evaluated its performance in an emergency department setting. The tool is built on a transformer-based encoder-decoder model that takes chest radiographs as input and generates radiology reports as output.
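As an illustration of this kind of architecture, a minimal Python sketch follows. It uses a publicly available, general-purpose image-captioning checkpoint (nlpconnect/vit-gpt2-image-captioning) purely as a stand-in; the study's actual model, weights, and preprocessing are not public here, and the input file name is hypothetical.

```python
# Minimal sketch of an image-to-text encoder-decoder pipeline (illustrative only;
# not the study's model). The encoder maps the image to visual features, and the
# decoder autoregressively generates report-style text conditioned on them.
from transformers import VisionEncoderDecoderModel, ViTImageProcessor, AutoTokenizer
from PIL import Image

ckpt = "nlpconnect/vit-gpt2-image-captioning"  # stand-in public checkpoint
model = VisionEncoderDecoderModel.from_pretrained(ckpt)
processor = ViTImageProcessor.from_pretrained(ckpt)
tokenizer = AutoTokenizer.from_pretrained(ckpt)

image = Image.open("chest_radiograph.png").convert("RGB")  # hypothetical input file
pixel_values = processor(images=image, return_tensors="pt").pixel_values

output_ids = model.generate(pixel_values, max_length=128, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```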

The test dataset for evaluating the tool consisted of 500 anterior-posterior or posterior-anterior chest radiographs for which final radiologist and teleradiology reports were also available. Chest radiographs of individuals younger than 18 years or older than 89 years were excluded. The AI-generated, teleradiology, and final radiologist reports were presented without identifying their source, and prior chest radiographs were also used where available.

Six board-certified physicians, blinded to report type, evaluated the AI-generated, teleradiology, and final radiologist reports alongside additional data, including the chest radiograph images and the type of image acquisition. Diagnostic accuracy and quality of the reports were assessed on a five-point Likert scale. Any discrepancy or error that might change the physician's clinical management of the patient was considered clinically significant, and assessors were required to comment on any such discrepancy.

A cumulative link mixed model was used to compare Likert scores across the AI-generated, teleradiology, and final radiologist reports. The primary outcome was the difference in Likert scores between the three report types. Secondary analyses estimated the probability of each report type containing clinically significant discrepancies.
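For readers unfamiliar with this class of model, the sketch below fits a cumulative link (proportional-odds) model to simulated Likert ratings in Python. Note that the study used a cumulative link mixed model, which also includes random effects (for example, per evaluator or per radiograph); those are omitted here because statsmodels' OrderedModel supports only the fixed-effect comparison of report types, and all numbers below are simulated, not study data.

```python
# Simplified ordinal-regression sketch: Likert quality scores regressed on report
# type with a cumulative link model. Toy simulated data, not study results.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 300
report_type = rng.choice(["ai", "radiologist", "teleradiology"], size=n)

# Simulated 1-5 Likert ratings around arbitrary toy means.
toy_means = {"ai": 4.2, "radiologist": 4.3, "teleradiology": 3.8}
scores = np.clip(np.round(rng.normal([toy_means[t] for t in report_type], 1.0)), 1, 5).astype(int)

# Ordered categorical outcome; categories are inferred from the observed scores.
likert = pd.Series(pd.Categorical(scores, ordered=True))
exog = pd.get_dummies(pd.Series(report_type), drop_first=True, dtype=float)

# Logit link gives the proportional-odds (cumulative link) model; no intercept is
# included because the category thresholds play that role.
model = OrderedModel(likert, exog, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())  # coefficients compare report types on the latent scale
```

In practice, the mixed-effects version of this analysis is typically fit with R's ordinal::clmm, which adds the random-effect terms omitted in this sketch.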

Results

The results suggested that the AI tool for interpreting chest radiographs provided radiology reports similar in textual quality and accuracy to the final radiologist reports, and of higher textual quality than the teleradiology reports. The tool's success in producing accurate, high-quality radiology reports highlights its potential to aid diagnosis and decision-making in the emergency department.

Of the 500 chest radiographs, 434 were portable anterior-posterior acquisitions, one was a direct posterior-anterior acquisition, and 65 were posterior-anterior and lateral films. The most common finding was infiltrates, followed by pulmonary edema, pleural effusion, presence of assistive devices, cardiomegaly, and pneumothorax. Additionally, reports with a Likert score below three were examined, and their discrepancies were classified as missed important findings, inconsistent findings, or inappropriately relevant findings.

In one case, the AI-generated report even improved on the radiologist report: evaluators noted that the radiologist report failed to detect a new infiltrate, which the AI report identified. In another case, the radiologist report described the opacity in the image as persistent, while the teleradiology and AI-generated reports indicated that the opacity had worsened, a finding that may be of substantial clinical significance.

Compared with other tools that use classification systems, such as binary presence-or-absence predictions, to diagnose individual pathologies, the AI tool draws on prior and current chest radiographs to also provide information on factors such as the location, severity, and clinical course of a finding. Furthermore, the AI report provides relevant contextual information that can be used for differential diagnosis and as a basis for recommending further assessment.

Conclusions

In summary, the study reported that an AI tool for rapid, immediate interpretation of chest radiographs in the emergency department setting produced radiology reports of comparable quality and accuracy to radiologist reports, and outperformed teleradiology reports in textual quality. Given their short processing time and high accuracy, such AI tools could potentially be used by emergency department physicians to assist and streamline patient care.

Journal Reference:

  • Huang, J., Neal, L., Wittbrodt, M., Melnick, D., Klug, M., Thompson, M., Balitz, J., Loftus, T., Malik, S., Full, A., Weston, V., Alex, H.J., & Etemadi, M. (2023). Generative Artificial Intelligence for Chest Radiograph Interpretation in the Emergency Department. JAMA Network Open, 6(10), e2336100. https://doi.org/10.1001/jamanetworkopen.2023.36100, https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2810195
