[WACV 2024] I-AI: A Controllable & Interpretable AI System for Decoding Radiologists’ Intense Focus for Accurate CXR Diagnoses

Trong Thang Pham, Jacob Brecheisen, Anh Nguyen, Hien Nguyen, Ngan Le

About this repo

Cleaning the code for this repo will take some more time, but the data is available on request in the meantime. Email your request to Trong Thang Pham - [email protected]. Because this dataset is built from REFLACX and MIMIC-CXR, please attach proof that you have access to and can download those datasets.

Introduction

In the field of chest X-ray (CXR) diagnosis, existing works often focus solely on determining where a radiologist looks, typically through tasks such as detection, segmentation, or classification. However, these approaches are often designed as black-box models, lacking interpretability. In this paper, we introduce Interpretable Artificial Intelligence (I-AI), a novel and unified controllable interpretable pipeline for decoding the intense focus of radiologists in CXR diagnosis. Our I-AI addresses three key questions: where a radiologist looks, how long they focus on specific areas, and what findings they diagnose. By capturing the intensity of the radiologist's gaze, we provide a unified solution that offers insights into the cognitive process underlying radiological interpretation. Unlike current methods that rely on black-box machine learning models, which are prone to extracting erroneous information from the entire input image during diagnosis, we tackle this issue by effectively masking out irrelevant information. Our proposed I-AI leverages a vision-language model, allowing for precise control over the interpretation process while ensuring the exclusion of irrelevant features. To train our I-AI model, we utilize an eye gaze dataset to extract anatomical gaze information and generate ground truth heatmaps. Through extensive experimentation, we demonstrate the efficacy of our method. We show that the attention heatmaps, designed to mimic radiologists' focus, encode sufficient and relevant information, enabling accurate classification using only a portion of the CXR.
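As a rough illustration of the idea (a minimal sketch with hypothetical fixation data and parameters, not the code in this repository), eye-gaze fixations can be accumulated into a duration-weighted Gaussian heatmap, which is then used to mask out the unattended parts of the CXR before classification:

    # Minimal sketch (assumed, not this repo's implementation): turn eye-gaze
    # fixations into a duration-weighted heatmap, then mask the unattended pixels.
    import numpy as np

    def fixations_to_heatmap(fixations, shape, sigma=25.0):
        """fixations: iterable of (x, y, duration); returns a heatmap in [0, 1]."""
        h, w = shape
        ys, xs = np.mgrid[0:h, 0:w]
        heatmap = np.zeros(shape, dtype=np.float32)
        for x, y, duration in fixations:
            heatmap += duration * np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2.0 * sigma ** 2))
        return heatmap / (heatmap.max() + 1e-8)

    def mask_cxr(image, heatmap, threshold=0.2):
        """Zero out pixels whose attention value falls below the (illustrative) threshold."""
        return image * (heatmap >= threshold)

    # Toy usage with a random "CXR" and two hypothetical fixations (x, y, seconds).
    cxr = np.random.rand(256, 256).astype(np.float32)
    heatmap = fixations_to_heatmap([(100, 120, 0.8), (180, 60, 0.3)], cxr.shape)
    masked = mask_cxr(cxr, heatmap)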

Installation

  1. Clone the repository
    git clone https://github.com/UARK-AICV/IAI.git
  2. Navigate to the project directory
    cd IAI
  3. Install the dependencies
    bash installenv.sh

Usage

  • Pretrained Weights

    Model   Config                                                      Weights
    iai     configs/tsan_biomedclip_l2_mask_dice_heatmap_e2e_v2.yaml    onedrive

Demo

To generate results from a folder of images, run:

python demo/demo_producing_all_mask.py --input <input_folder> --output <output_folder> --config <config_file> --weights <weights_file>
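For example, assuming the input images are in samples/, the outputs should go to results/, and the pretrained weights were downloaded to weights/iai.pth (all three paths are illustrative), the call would look like:

python demo/demo_producing_all_mask.py --input samples/ --output results/ --config configs/tsan_biomedclip_l2_mask_dice_heatmap_e2e_v2.yaml --weights weights/iai.pth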
