Understanding the behavior of gas sensors using explainable AI
Infineon Technologies
Academic Editor: Francisco Falcone

Abstract:

Dangerous air pollutants, such as ozone (O3) and nitrogen dioxide (NO2), pose major health risks. Technological advances in low-cost gas sensors equipped with deep learning algorithms enable monitoring of such gases on a large scale. In our application, O3 and NO2 concentrations are predicted using a Gated Recurrent Unit (GRU) neural network. However, neural networks pose the challenge of making individual predictions interpretable for humans. Our research addresses this difficulty by adopting two explainable artificial intelligence (XAI) methodologies for gas sensors, in order to understand the reasoning behind the model’s predictions and to facilitate the characterization of sensor behavior.

The first technique quantifies the contribution of each input feature to the predictions using the Shapley Additive Explanations (SHAP) method. The features with the highest scores are considered the most important, and vice versa. This analysis allowed us to drop low-scoring features from the model, reducing memory and computation requirements and thus making the model more energy efficient. It also helped improve the quality of our sensor material: one of our core features initially scored low, putting the corresponding sensing material under scrutiny; after material improvements, that feature became one of the most impactful.
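The following sketch illustrates this kind of SHAP-based feature ranking for a GRU regressor. It is not the authors’ exact pipeline: the model architecture, data shapes, and feature indices are illustrative placeholders, and a gradient-based SHAP explainer is used as one possible choice.

```python
# Illustrative sketch: ranking the input features of a GRU gas-concentration
# model by mean |SHAP| value. All shapes and the model itself are placeholders.
import numpy as np
import torch
import torch.nn as nn
import shap


class GasGRU(nn.Module):
    """Toy GRU mapping a window of sensor features to [O3, NO2] estimates."""

    def __init__(self, n_features=6, hidden=16):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)   # two outputs: O3 and NO2

    def forward(self, x):                  # x: (batch, timesteps, n_features)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])    # predict from the last time step


torch.manual_seed(0)
model = GasGRU().eval()

# Synthetic sensor windows stand in for real measurement data.
background = torch.randn(50, 32, 6)        # reference samples for the explainer
samples = torch.randn(10, 32, 6)           # samples to be explained

# GradientExplainer supports PyTorch modules and yields one attribution
# array per model output (here: O3 and NO2).
explainer = shap.GradientExplainer(model, background)
shap_values = explainer.shap_values(samples)

# Depending on the SHAP version, multi-output attributions come back either
# as a list (one array per output) or as a single array with a trailing
# output dimension; handle both and pick the O3 output.
if isinstance(shap_values, list):
    o3_attr = shap_values[0]               # (samples, timesteps, n_features)
else:
    o3_attr = shap_values[..., 0]

# Aggregate |SHAP| over samples and time steps to score each feature;
# the lowest-scoring features are candidates for removal.
feature_scores = np.abs(o3_attr).mean(axis=(0, 1))
for i, score in enumerate(feature_scores):
    print(f"feature {i}: mean |SHAP| = {score:.4f}")
```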

The second approach, network dissection, explains the inner workings of the network by examining the hidden-state activations of each GRU unit in order to understand certain (unexpected) predictions. Our analysis showed which GRU units respond to O3 and which to NO2. It also revealed that at higher O3 concentrations NO2 is masked by O3, which is consistent with the underlying physics of the sensing material. Understanding the behavior of the dissected blocks of a neural network also helps in choosing suitable hyperparameters, such as the number of hidden units, for a leaner and more robust model.
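A minimal sketch of this dissection idea is shown below: record per-unit hidden-state activations for input windows dominated by O3 and by NO2, then compare which units respond to which gas. It assumes the hypothetical GasGRU model from the previous sketch and uses synthetic data in place of labelled calibration measurements; it is not the authors’ exact procedure.

```python
# Illustrative sketch: identify GRU units that respond to O3 vs. NO2 by
# comparing mean absolute hidden-state activations on gas-specific inputs.
import torch

model = GasGRU().eval()  # hypothetical model from the SHAP sketch above


def unit_activations(x):
    """Mean absolute hidden-state activation per GRU unit over a batch."""
    with torch.no_grad():
        h, _ = model.gru(x)                # h: (batch, timesteps, hidden)
    return h.abs().mean(dim=(0, 1))        # -> (hidden,)


# Stand-ins for measurement windows recorded under O3- and NO2-dominated
# exposure; in practice these would come from labelled calibration runs.
o3_windows = torch.randn(100, 32, 6)
no2_windows = torch.randn(100, 32, 6)

act_o3 = unit_activations(o3_windows)
act_no2 = unit_activations(no2_windows)

# Units that activate much more strongly for one gas than the other are
# taken as "responding" to that gas; small differences suggest units that
# encode shared or cross-sensitivity effects.
diff = act_o3 - act_no2
o3_units = torch.argsort(diff, descending=True)[:3]
no2_units = torch.argsort(diff)[:3]
print("units most responsive to O3 :", o3_units.tolist())
print("units most responsive to NO2:", no2_units.tolist())
```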

Keywords: gas sensors; explainable AI; XAI; SHAP; network dissection; feature ranking