Leveraging explainable AI for informed building retrofit decisions: insights from a survey

Accurate predictions of building energy consumption are essential for reducing the energy performance gap. While data-driven energy quantification methods based on machine learning deliver promising results, the lack of explainability prevents their widespread application. To overcome this, Explainable Artificial Intelligence (XAI) was introduced. However, to this point, no research has examined how effective these explanations are concerning decision-makers, i.e., property owners. To address this, we implement three transparent models (Linear Regression, Decision Tree, QLattice) and apply four XAI methods (Partial Dependency Plots, Accumulated Local Effects, Local Interpretable Model-Agnostic Explanations, Shapley Additive Explanations) to an Artificial Neural Network using a real-world dataset of 25,000 residential buildings. We evaluate their prediction accuracy and explainability through a survey with 137 participants, considering the human-centered dimensions of explanation satisfaction and perceived fidelity. The results quantify the explainability-accuracy trade-off in building energy consumption forecasting and show how it can be counteracted by choosing the right XAI method to foster informed retrofit decisions. For research, we set the foundation for further increasing the explainability of data-driven energy quantification methods and their human-centered evaluation. For practice, we encourage using XAI to reduce the acceptance gap of data-driven methods, whereby the XAI method should be selected carefully, as the explainability within the methods varies by up to 10 %.

Metadata
Author:Daniel Leuthe, Jonas Mirlach, Simon Wenninger, Christian Wiethe
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/113620
ISSN:0378-7788
Parent Title (English):Energy and Buildings
Publisher:Elsevier BV
Type:Article
Language:English
Year of first Publication:2024
Publishing Institution:Universität Augsburg
Release Date:2024/06/25
First Page:114426
DOI:https://doi.org/10.1016/j.enbuild.2024.114426
Institutes:Fakultät für Angewandte Informatik
Fakultät für Angewandte Informatik / Institut für Informatik
Dewey Decimal Classification:0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Latest Publications (not yet published in print)
Licence:CC-BY-NC 4.0: Creative Commons: Attribution - NonCommercial (with Print on Demand)