TY - JOUR
AU - Assegie, Tsehay Admassu
PY - 2022/04/22
Y2 - 2024/03/29
TI - Evaluation of the Shapley Additive Explanation Technique for Ensemble Learning Methods
JF - Proceedings of Engineering and Technology Innovation
JA - Proc. eng. technol. innov.
VL - 21
IS - 0
SE - Articles
DO - 10.46604/peti.2022.9025
UR - https://ojs.imeti.org/index.php/PETI/article/view/9025
SP - 20-26
AB - This study aims to explore the effectiveness of the Shapley additive explanation (SHAP) technique in developing a transparent, interpretable, and explainable ensemble method for heart disease diagnosis using random forest algorithms. Firstly, the features with high impact on the heart disease prediction are selected by SHAP using 1025 heart disease datasets, obtained from a publicly available Kaggle data repository. After that, the features which have the greatest influence on the heart disease prediction are used to develop an interpretable ensemble learning model to automate the heart disease diagnosis by employing the SHAP technique. Finally, the performance of the developed model is evaluated. The SHAP values are used to obtain better performance of heart disease diagnosis. The experimental result shows that 100% prediction accuracy is achieved with the developed model. In addition, the experiment shows that age, chest pain, and maximum heart rate have positive impact on the prediction outcome.
ER -
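
The abstract describes a SHAP-plus-random-forest workflow. The following Python sketch illustrates that general technique under stated assumptions; the file name "heart.csv", the "target" column, the train/test split, and the hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the workflow described in the abstract: train a random
# forest on a heart-disease dataset, then use SHAP to rank feature impact.
# Dataset path, column names, and parameters are assumptions for illustration.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical Kaggle heart-disease CSV with a binary "target" column.
df = pd.read_csv("heart.csv")
X, y = df.drop(columns=["target"]), df["target"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)

# Summary plot ranks features (e.g., age, chest pain, max heart rate) by
# their contribution to the predictions.
shap.summary_plot(shap_values, X_test)

print("Test accuracy:", model.score(X_test, y_test))
```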