Abstract
Interpretable artificial intelligence (AI), also known as explainable AI, is indispensable for establishing trustworthy AI for bench-to-bedside translation, with substantial implications for human well-being. However, most existing research in this area has centered on designing complex and sophisticated methods with little regard for their interpretability. Consequently, the main prerequisite for deploying trustworthy AI in medical domains has not been met. Scientists have developed various explanation methods for interpretable AI. Among these, fuzzy rules embedded in a fuzzy inference system (FIS) have emerged as a novel and powerful tool for bridging the communication gap between humans and advanced AI machines. However, few reviews have examined the use of FISs in medical diagnosis. In addition, the application of fuzzy rules to different kinds of multimodal medical data has received insufficient attention, despite their potential in designing appropriate methodologies for available datasets. This review provides a fundamental understanding of interpretability and fuzzy rules, conducts comparative analyses of fuzzy rules and other explanation methods in handling three major types of multimodal data (i.e., sequence signals, medical images, and tabular data), and offers insights into appropriate fuzzy rule application scenarios and recommendations for future research.
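To make concrete how fuzzy rules in an FIS expose a model's reasoning in human-readable form, the following is a minimal sketch of a toy zero-order Sugeno-style FIS in Python. The heart-rate membership functions, rule consequents, and risk values are invented for illustration only and are not drawn from any study covered by this review.

```python
# Minimal sketch of a fuzzy inference system (FIS) with human-readable rules.
# All fuzzy sets, rules, and output values below are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over a "resting heart rate" input (beats per minute).
hr_low    = lambda x: tri(x, 30, 50, 70)
hr_normal = lambda x: tri(x, 55, 72, 90)
hr_high   = lambda x: tri(x, 80, 110, 160)

# Interpretable rules: IF heart rate is <fuzzy set> THEN risk is <crisp value>.
# Zero-order Sugeno consequents keep defuzzification to a weighted average.
rules = [
    (hr_low,    0.4),   # low heart rate    -> moderate risk
    (hr_normal, 0.1),   # normal heart rate -> low risk
    (hr_high,   0.8),   # high heart rate   -> high risk
]

def infer(x):
    """Fire every rule, then average outputs weighted by firing strength."""
    fired = [(mf(x), out) for mf, out in rules]
    total = sum(w for w, _ in fired)
    if total == 0.0:
        return None  # input lies outside every fuzzy set
    return sum(w * out for w, out in fired) / total
```

For example, `infer(72)` fires only the "normal" rule at full strength and returns 0.1, while inputs between two sets blend the corresponding rule outputs; each prediction can thus be traced back to the linguistic rules that produced it.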
Original language | English |
---|---|
Article number | 120212 |
Journal | Information Sciences |
Volume | 662 |
Early online date | 26 Jan 2024 |
DOIs | |
Publication status | Published - Mar 2024 |
Externally published | Yes |
Bibliographical note
This research was partly supported by research grants of the Shenzhen Basic Research Program (JCYJ20210324130209023); the Mainland-Hong Kong Joint Funding Scheme (MHKJFS) (MHP/005/20); the Health and Medical Research Fund (HMRF 09200576), the Health Bureau, the Government of the Hong Kong Special Administrative Region; the Project of Strategic Importance Fund (P0035421), the Project of RISA Fund (P0043001), and the Centrally Funded Postdoctoral Fellowship Scheme (P0045698) from The Hong Kong Polytechnic University; the Project of Ministry of Education ‘Chunhui Plan’ Cooperative Scientific Research (HZKY20220133); the National Natural Science Foundation of Jiangsu, China under Grant BK20191200; the Natural Science Foundation of Jiangsu Universities, China under Grant 19JKD520003; the National Defense Basic Research Program of China under Grant JCKY2020206B037; and the Jiangsu Graduate Scientific Research Innovation Project under Grants KYCX21_3506 and KYCX22_3825.

Keywords
- Disease diagnosis
- Explainable artificial intelligence
- Fuzzy inference system
- Fuzzy rule
- Interpretability