Towards Interpretable Energy Estimation for Edge AI Applications

Abstract

Edge AI is gaining popularity for enabling edge devices to perform machine learning tasks locally. However, the resource-intensive nature of machine learning algorithms poses challenges for deploying and executing edge AI applications on resource-constrained devices. Addressing these challenges requires a thorough understanding of the energy consumption behavior of machine learning algorithms in distributed edge architectures. To this end, we propose a novel methodology that models energy consumption as a weighted sum of interpretable, learnable proxies, capturing key factors such as computation, data access, and communication. Our approach leverages explainable AI techniques to interpret proxy estimates, making it possible to identify the primary contributors to energy consumption in target applications. Preliminary results indicate that providing interpretable, component-level insights can effectively assist developers in making informed decisions on algorithm selection and configuration, fostering more efficient and sustainable edge AI practices.
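The core idea of modeling energy as a weighted sum of learnable proxies can be sketched as a simple linear regression. The proxy names, the synthetic measurements, and the least-squares fit below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical sketch: energy modeled as a weighted sum of three
# interpretable proxies (computation, data access, communication).
# All data here is synthetic; in practice the proxies would come
# from profiling runs of the target edge AI application.
rng = np.random.default_rng(0)
n = 200

# One row per run: proxy measurements in arbitrary units
# (e.g. FLOP count, bytes accessed, bytes transmitted).
proxies = rng.uniform(0.0, 1.0, size=(n, 3))
true_weights = np.array([2.0, 0.5, 1.5])  # unknown in practice
energy = proxies @ true_weights + rng.normal(0.0, 0.01, size=n)

# Learn the weights by least squares; each weight is directly
# interpretable as the energy cost per unit of its proxy.
weights, *_ = np.linalg.lstsq(proxies, energy, rcond=None)

# Per-component contributions reveal the primary energy drivers.
contributions = weights * proxies.mean(axis=0)
for name, w, c in zip(["compute", "data access", "communication"],
                      weights, contributions):
    print(f"{name}: weight={w:.2f}, mean contribution={c:.3f}")
```

Because each learned weight maps one-to-one to a physical factor, the fitted model is interpretable by construction, which is what lets component-level contributions be reported back to developers.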

Publication
3rd International Workshop on Intelligent and Adaptive Edge-Cloud Operations and Services, 39th IEEE International Parallel and Distributed Processing Symposium (IPDPS) 2025, June 2025