Publication | Closed Access
Explanation-Driven HCI Model to Examine the Mini-Mental State for Alzheimer's Disease
Citations: 55 | References: 36 | Year: 2022
Keywords: Artificial Intelligence; Engineering; Machine Learning; Brain Function; Explanation-Driven HCI Model; Biomedical Artificial Intelligence; Alzheimer's Disease; Data Science; Decision Tree; Network Physiology; Interpretability; AI Healthcare; Mini-Mental State; Cognitive Neuroscience; Cognitive Science; Computational Pathology; Computer Science; Computational Modeling; Deep Learning; Predictive Learning; Neuroimaging Biomarkers; Model Interpretability; Neuroscience; Explainable Artificial Intelligence; Medicine; Brain Modeling; Explainable AI
Directing research on Alzheimer's disease toward early prediction and accuracy alone is not a feasible approach to tackling this now-ubiquitous degenerative disease. Applying deep learning (DL), explainable artificial intelligence (XAI), and advancing toward the human-computer interface (HCI) model can be a leap forward in medical research. This research aims to propose a robust explainable HCI model using SHapley Additive exPlanations (SHAP), Local Interpretable Model-agnostic Explanations (LIME), and DL algorithms. The use of DL algorithms, namely logistic regression (80.87%), support vector machine (85.8%), k-nearest neighbor (87.24%), multilayer perceptron (91.94%), and decision tree (100%), together with explainability can help explore untapped avenues for research in the medical sciences that can shape the future of HCI models. The presented model's results show improved prediction accuracy by incorporating a user-friendly computer interface into decision-making, implying a high level of significance in the context of biomedical and clinical research.
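The classifier comparison described in the abstract can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data, not the paper's MMSE dataset, so the accuracies will differ from those reported above; the model hyperparameters shown are assumptions, not the authors' settings.

```python
# Hypothetical sketch: fitting the five classifiers named in the abstract
# on synthetic data and comparing held-out accuracy.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the MMSE feature data used in the paper.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "support vector machine": SVC(),
    "k-nearest neighbor": KNeighborsClassifier(),
    "multilayer perceptron": MLPClassifier(max_iter=1000, random_state=0),
    "decision tree": DecisionTreeClassifier(random_state=0),
}

# Fit each model and score it on the held-out split.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2%}")
```

On the trained models, the SHAP and LIME libraries could then be applied (e.g. `shap.TreeExplainer` on the decision tree) to produce the per-feature explanations the paper builds its HCI model around.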