
List of accepted submissions

 
 
 
  • Open access
  • 16 Reads
Enhancing Renewable Energy Efficiency with Artificial Intelligence

Introduction
Traditional energy sources face major challenges, including pollution, greenhouse gas emissions, and the rapid depletion of natural resources, which increases the need for sustainable solutions.
Renewable energy, including solar, wind, and hydropower, is considered one of the most important environmentally friendly alternatives. However, it faces challenges in forecasting output and managing grids efficiently due to weather fluctuations and production variability.
Artificial intelligence offers powerful tools to address these challenges by analyzing big data, forecasting production, optimizing storage, and managing smart grids efficiently.
Integrating AI with renewable energy can accelerate the transition toward a low-emission economy and enhance environmental and social sustainability.

Methodology
Data Collection: Gather production measurements from renewable power plants, weather records, and energy consumption readings.
Data Analysis: Use machine learning techniques to forecast energy production and consumption.
Modeling: Develop predictive models to optimize energy distribution and reduce losses.
Implementation: Integrate the models into smart grids to manage resources more efficiently.
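
The four-stage pipeline above can be illustrated with a minimal sketch of the forecasting step. The abstract does not specify a model, so ordinary least squares on hypothetical irradiance/output data stands in for the machine learning stage:

```python
def ols_fit(x, y):
    """Fit y = a*x + b by ordinary least squares."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# Hypothetical hourly irradiance (W/m^2) and PV plant output (kW).
irradiance = [100, 300, 500, 700, 900]
output_kw  = [0.9, 2.8, 5.1, 7.0, 9.2]

a, b = ols_fit(irradiance, output_kw)
forecast = a * 600 + b  # predicted output at 600 W/m^2
```

In the full pipeline this fitted relation would be replaced by a learned model and its predictions fed into the distribution-optimization stage.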

Results
Significant improvement was achieved in the accuracy of forecasting solar and wind electricity generation, with gains of up to 20–30%, enabling better planning and optimization of renewable energy resources. Losses in intelligent electrical grids were reduced, enhancing overall grid stability, reliability, and operational efficiency. The economic feasibility of renewable energy solutions increased, encouraging wider adoption of smart grids, promoting the integration of clean energy technologies, and ultimately supporting a more sustainable and resilient energy infrastructure for the future.

Conclusion
Traditional energy sources face major challenges such as pollution, greenhouse gas emissions, and the depletion of natural resources. Renewable energy, including solar, wind, and hydropower, offers an environmentally friendly solution, but it faces difficulties in production forecasting and smart grid management. Artificial intelligence contributes by analyzing data, predicting output, and optimizing storage and grid management, thereby increasing renewable energy efficiency, reducing losses, and enhancing economic, environmental, and social sustainability.

  • Open access
  • 7 Reads
A Machine Learning Regression Model for Early-Stage Prediction of Biogas Production

Biogas production from organic waste is widely recognized as an effective option for renewable energy generation and sustainable waste management. At the same time, accurate estimation of biogas yield remains difficult because the process is influenced by several interrelated factors, including feedstock type, moisture content, and general operating conditions. These factors interact in a nonlinear manner, which limits the applicability of conventional empirical correlations and simplified mathematical models. This limitation is particularly important at the early stages of biogas project development, where quick and reasonably accurate yield estimation is required, while detailed kinetic or operational data are often unavailable. In this study, a machine-learning-based regression model is developed to predict biogas production using experimental data reported in the literature. The dataset is formed from previously published anaerobic digestion studies and includes a range of organic substrates and operating conditions. To ensure practical applicability, only a small number of commonly reported input variables are considered, mainly related to substrate composition and basic process characteristics. This choice reflects the typical level of information available during preliminary design and feasibility assessment. The data are divided into training and testing subsets using a standard train–test split in order to evaluate the predictive capability of the model for unseen data. The performance of the proposed model is evaluated using statistical indicators, with the coefficient of determination (R²) used as the main metric. The results show good agreement between predicted and experimental biogas production values. For the test dataset, the R² value is close to 0.9, indicating that the model is able to explain a large portion of the variability observed in the experimental data. 
These results suggest that the main relationships affecting biogas generation can be captured using a single regression-based machine learning model, without the need for complex model structures or extensive input data. The obtained results highlight the potential of data-driven regression approaches as practical tools for early-stage analysis of biogas systems. While the proposed model does not replace detailed process modeling or experimental studies, it can support preliminary feasibility analysis, comparison of different feedstocks, and initial technology assessment. Due to its simplicity and modest data requirements, the model may be useful for engineers and researchers involved in the early planning and evaluation of biogas-based renewable energy projects.
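
As a minimal illustration of the abstract's train–test protocol, the sketch below fits a single regression on hypothetical substrate data (the values are invented, not the paper's literature dataset) and reports R² on the held-out split:

```python
def ols_fit(x, y):
    """Fit y = a*x + b by ordinary least squares."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def r_squared(y_true, y_pred):
    """Coefficient of determination R^2 = 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical data: volatile solids fraction -> biogas yield (mL CH4/g VS).
vs       = [0.60, 0.65, 0.70, 0.75, 0.80, 0.85, 0.90, 0.95]
yield_ml = [310, 335, 362, 388, 410, 441, 465, 492]

# Standard train-test split: fit on the first six points, test on the rest.
train_x, train_y = vs[:6], yield_ml[:6]
test_x,  test_y  = vs[6:], yield_ml[6:]

a, b = ols_fit(train_x, train_y)
pred = [a * x + b for x in test_x]
r2 = r_squared(test_y, pred)
```

The study's actual model and feature set are not disclosed in the abstract; this only mirrors the evaluation procedure (held-out split, R² as the main metric).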

  • Open access
  • 4 Reads
Machine Learning-Based Forecasting and Adaptive Control of Energy Flow in Intelligent Electrical Networks

Modern intelligent electrical networks, commonly referred to as smart grids, operate under conditions of increasing uncertainty caused by rapidly fluctuating load demands, high penetration of renewable energy sources, and complex nonlinear interactions among system components. The integration of distributed generation units, such as photovoltaic and wind energy systems, introduces significant variability and intermittency into power supply regimes. Under such conditions, conventional forecasting and control strategies, which are typically based on static models and predefined operating rules, often fail to provide sufficient adaptability, robustness, and efficiency. As a result, power system operators face challenges related to energy losses, voltage instability, and reduced reliability of grid operation. In this context, the development of intelligent forecasting and adaptive control approaches capable of responding to real-time operating conditions has become a critical research direction for sustainable and resilient energy systems. This study develops a machine learning-based forecasting and adaptive control framework to optimize energy flow in intelligent electrical networks under uncertain load and renewable generation conditions. The methodology consists of four main stages: data preparation, forecasting model construction, adaptive control and constrained optimization, and simulation-based validation. Historical and real-time operational data are organized into an input–output dataset including load demand, renewable generation output, voltage and frequency deviations, and operational constraints. A neural network-based time-series forecasting model is employed to predict short-term load dynamics and system operating states, capturing nonlinear temporal dependencies. 
Based on the predicted states, an adaptive control layer computes real-time control actions using a constrained optimization strategy to minimize energy losses while maintaining system stability and satisfying technological and safety constraints. The framework supports bidirectional energy exchange between the main grid and distributed renewable energy sources and is designed for real-time operation within a smart grid architecture. Simulation studies are conducted using a representative intelligent electrical network model operating under variable load conditions and renewable energy penetration. The proposed framework is evaluated in terms of forecasting accuracy, energy efficiency, and dynamic stability and compared with conventional forecasting and control strategies. The results demonstrate that the machine learning-based forecasting model improves short-term load prediction accuracy by approximately 15–20%. In addition, the adaptive control mechanism enables more efficient energy flow regulation, resulting in a reduction in total energy losses by about 10–12%. The system also exhibits improved dynamic stability, characterized by faster convergence to steady-state operating conditions and reduced sensitivity to load fluctuations. The obtained results confirm that the integration of machine learning-based forecasting with adaptive control mechanisms significantly enhances the performance of intelligent electrical networks under uncertain operating conditions. The proposed framework improves operational stability, efficiency, and resilience and can be effectively integrated into smart grid decision-support systems and digital twin platforms. Overall, this study demonstrates the potential of machine learning-based forecasting and adaptive control for sustainable, reliable, and efficient smart grid operation.
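
The adaptive control layer described above can be sketched as a one-step constrained dispatch rule. The formulation is illustrative (the abstract does not give its optimization problem); battery power and energy limits stand in for the "technological and safety constraints":

```python
def dispatch(load_kw, renew_kw, soc_kwh, cap_kwh, p_max_kw, dt_h=1.0):
    """Cover the forecast net load from the battery within power and
    energy limits; the residual is exchanged bidirectionally with the
    main grid (positive = import, negative = export)."""
    net = load_kw - renew_kw                    # + deficit, - surplus
    # Battery power limited by converter rating...
    p_batt = max(-p_max_kw, min(p_max_kw, net))
    if p_batt > 0:                              # discharging
        p_batt = min(p_batt, soc_kwh / dt_h)    # ...and stored energy
    else:                                       # charging
        p_batt = max(p_batt, -(cap_kwh - soc_kwh) / dt_h)  # headroom
    grid = net - p_batt
    soc = soc_kwh - p_batt * dt_h
    return p_batt, grid, soc

# Deficit hour: load 8 kW, renewables 3 kW, half-charged 10 kWh battery.
p_batt, grid_kw, soc_new = dispatch(8.0, 3.0, soc_kwh=4.0,
                                    cap_kwh=10.0, p_max_kw=2.0)
```

In the proposed framework this rule would be replaced by the constrained optimizer acting on neural-network state predictions at each control step.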

  • Open access
  • 5 Reads
Optimizing Remaining Useful Life Prediction in Photovoltaic Systems Using AI-Enhanced LSTM Models

Introduction

Accurate prediction of the Remaining Useful Life (RUL) of photovoltaic (PV) systems is essential for proactive maintenance, cost reduction, and sustainable energy planning. Unlike structural health monitoring, RUL forecasting relies on modeling long-term temporal degradation patterns derived from electrical output, environmental exposure, and operational stressors. In Nigerian PV installations, high temperature variability, dust loading, and irradiance fluctuations intensify nonlinear degradation, rendering traditional statistical models insufficient. This work presents an AI-driven RUL prediction framework that systematically evaluates time-series learning models, with emphasis on Long Short-Term Memory (LSTM) networks integrated with Internet of Things (IoT) sensor data. The study is based on a systematic literature analysis and multi-criteria analytical evaluation of published photovoltaic diagnostic studies, rather than new experimental or field-generated data.

Methods

A multi-criteria analytical framework was applied to rank AI–hardware alternatives for RUL prediction using literature-derived, traceable performance metrics. Evaluated alternatives included RNN/LSTM + IoT sensors, ARIMA + sensors, ANN/DNN + I–V tracers, hybrid CNN + IoT + I–V ensembles, autoencoder-IoT models, and SVM-based approaches. Performance criteria comprised RMSE, MAE, MAPE, R², Accuracy, and Scalability. Metrics were normalized and weighted using the CRITIC method, which assigned dominant importance to R² (0.2632), RMSE (0.2584), MAE (0.2381), and MAPE (0.2367), while Accuracy received minimal weight (0.0036) due to limited discriminatory power. TOPSIS ranking was then applied to identify the most reliable RUL forecasting pipelines under real-world conditions.
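
The CRITIC-weighted TOPSIS ranking can be reproduced in outline. The weights below are taken from the abstract; the decision-matrix values are illustrative placeholders, not the study's actual literature-derived metrics:

```python
import math

criteria = ["R2", "RMSE", "MAE", "MAPE"]
weights  = [0.2632, 0.2584, 0.2381, 0.2367]   # CRITIC weights from the study
benefit  = [True, False, False, False]        # R2: higher better; errors: lower

# Illustrative decision matrix (three of the six evaluated alternatives).
matrix = {
    "RNN/LSTM + IoT sensors": [0.9999, 0.002, 1.24, 2.0],
    "Hybrid CNN + IoT + I-V": [0.97,   0.05,  3.0,  5.0],
    "ARIMA + sensors":        [0.95,   0.08,  4.5,  8.0],
}

def topsis(matrix, weights, benefit):
    """Vector-normalize, weight, and score by closeness to the ideal."""
    cols = list(zip(*matrix.values()))
    norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
    weighted = {n: [w * v / s for v, w, s in zip(row, weights, norms)]
                for n, row in matrix.items()}
    wcols = list(zip(*weighted.values()))
    ideal = [max(c) if b else min(c) for c, b in zip(wcols, benefit)]
    anti  = [min(c) if b else max(c) for c, b in zip(wcols, benefit)]
    scores = {}
    for n, row in weighted.items():
        d_pos = math.sqrt(sum((v - i) ** 2 for v, i in zip(row, ideal)))
        d_neg = math.sqrt(sum((v - a) ** 2 for v, a in zip(row, anti)))
        scores[n] = d_neg / (d_pos + d_neg)   # closeness coefficient Ci
    return scores

scores = topsis(matrix, weights, benefit)
ranking = sorted(scores, key=scores.get, reverse=True)
```

With these placeholder rows the LSTM-IoT alternative dominates every criterion and therefore scores Ci = 1; the study's reported coefficients (0.663, 0.522, 0.504) reflect its real, non-dominated data.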

Results

The TOPSIS evaluation ranked RNN/LSTM + IoT sensors as the strongest RUL prediction approach with a closeness coefficient of 0.663, significantly outperforming all alternatives. LSTM-IoT models achieved RMSE values as low as 0.0019–0.0020, MAE ≈ 1.24 kWh, and near-perfect explanatory power (R² ≈ 0.9999) across multiple studies. Hybrid CNN + IoT + I–V ensembles ranked second (Ci = 0.522), demonstrating robustness through multi-modal fusion. ARIMA-sensor models ranked third (Ci = 0.504), showing strong statistical fit but reduced adaptability to nonlinear degradation. Autoencoder-IoT methods reported extremely high accuracies (≈99–100%) but ranked lower due to weak error-based performance and limited field validation. ANN/DNN and SVM-based models consistently ranked lowest due to sparse reporting of RUL-specific error metrics and constrained scalability.

Conclusion

The findings confirm that LSTM-based IoT frameworks are the most effective and field-ready solution for RUL prediction in photovoltaic systems, particularly under Nigerian environmental conditions. By minimizing prediction error while maintaining high explanatory power, LSTM-IoT models enable reliable forecasting of PV degradation trajectories and support predictive maintenance scheduling. When integrated with UAV-thermal SHM pipelines, the approach forms a unified, end-to-end diagnostic framework that addresses both immediate structural faults and long-term lifespan degradation, advancing scalable PV asset management in sub-Saharan Africa.

  • Open access
  • 12 Reads
Energy-Aware Machine Learning Framework for Multi-Class Weather Classification in Edge-Enabled Renewable Energy Systems

Accurate, low-latency weather intelligence plays a critical role in renewable energy integration, smart grid stability, and climate-resilient infrastructure planning. Conventional numerical weather prediction approaches, while physically robust, are computationally intensive and often unsuitable for localized and real-time decision-making in distributed energy environments. This work proposes an energy-aware machine learning framework for multi-class weather classification designed to support edge-enabled renewable energy systems and data-driven urban energy management. A decade-long historical meteorological dataset (2014–2023) containing hourly weather observations is utilized to train and evaluate multiple supervised learning models under a unified experimental protocol.

Four widely adopted classifiers—Decision Trees, Gaussian kernel Support Vector Machines, Feedforward Neural Networks, and Ensemble (Bagging) methods—are systematically compared using consistent preprocessing, temporal feature extraction, five-fold cross-validation, and Bayesian hyperparameter optimization. Model performance is assessed using both predictive and computational metrics, including classification accuracy, macro-averaged F1-score, ROC-AUC, training time, and inference throughput. This dual-metric evaluation enables explicit quantification of the trade-off between predictive quality and computational energy demand, which is a critical factor in edge and embedded deployment scenarios.
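
The dual-metric evaluation (predictive quality vs. computational cost) can be sketched with a toy threshold classifier; the rules, samples, and labels below are invented for illustration and merely stand in for the four compared models:

```python
import time

def classify(temp_c, humidity):
    """Toy rule-based weather classifier (thresholds are invented)."""
    if humidity > 85:
        return "rain"
    if temp_c > 28:
        return "clear"
    return "cloudy"

# Hypothetical hourly observations: (temperature C, humidity %, label).
samples = [
    (31.0, 40, "clear"),
    (22.0, 90, "rain"),
    (18.0, 60, "cloudy"),
    (30.0, 88, "rain"),
    (25.0, 50, "clear"),   # deliberately misclassified by the toy rules
]

# Predictive metric: classification accuracy.
t0 = time.perf_counter()
preds = [classify(t, h) for t, h, _ in samples]
elapsed = time.perf_counter() - t0
accuracy = sum(p == y for p, (_, _, y) in zip(preds, samples)) / len(samples)

# Computational metric: inference throughput (observations per second).
throughput = len(samples) / elapsed if elapsed > 0 else float("inf")
```

Reporting both numbers per model is what makes the accuracy–efficiency frontier in the abstract explicit.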

Experimental results indicate that Ensemble (Bagging) achieves the highest predictive performance with an accuracy of 85.44% and ROC-AUC of 0.91, demonstrating strong generalization across diverse weather categories. Gaussian SVM provides near-comparable accuracy with reduced training overhead, offering a balanced compromise between precision and computational burden. Decision Trees exhibit exceptional inference speed exceeding 590,000 observations per second, highlighting their suitability for real-time and low-power edge applications despite comparatively lower accuracy. Neural Networks deliver moderate but stable performance across all metrics, emphasizing the influence of architecture depth and optimization strategies on tabular meteorological data.

The findings reveal a clear accuracy–efficiency frontier that supports application-specific model selection for renewable energy forecasting, solar and wind resource assessment, smart city weather sensing, and distributed energy management systems. Conceptual edge-AI deployment analysis further suggests that lightweight classifiers can significantly reduce computational energy consumption while maintaining acceptable predictive reliability. This research contributes an energy-conscious, scalable machine learning approach that bridges artificial intelligence and sustainable energy infrastructures, enabling more resilient, adaptive, and environmentally responsible smart energy ecosystems.

  • Open access
  • 9 Reads
Deep Learning-Driven Real-Time Optimization for Improved Exergy Efficiency in Hybrid Renewable Energy Conversion Systems

Hybrid renewable energy systems that integrate solar, wind, and energy storage provide a promising solution for sustainable electricity generation. However, the intermittent nature of these resources often leads to inefficient energy conversion and increased thermodynamic losses, reducing exergy efficiency. Traditional rule-based control strategies struggle to adapt to rapidly changing environmental conditions. This study proposes an artificial intelligence–based framework that combines deep learning forecasting with reinforcement learning control to enhance system performance.

The proposed approach utilizes a Long Short-Term Memory (LSTM) neural network to predict short-term solar irradiance and wind speed using meteorological data from NASA POWER and historical records. These forecasts are integrated into a reinforcement learning agent based on the Proximal Policy Optimization (PPO) algorithm, which dynamically adjusts system parameters such as inverter operation, battery scheduling, and wind turbine pitch angle. The objective is to minimize exergy destruction while maintaining stable energy output.

The framework was evaluated using a simulation model of a hybrid solar–wind–battery system developed with Python-based tools, including TensorFlow. Exergy analysis was applied to quantify losses across system components. Validation was further conducted using a small-scale laboratory prototype comprising photovoltaic panels, a micro wind turbine, battery storage, and programmable converters.
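
The component-level exergy bookkeeping behind such an analysis can be sketched as follows; the component list mirrors the prototype described above, but all exergy values are hypothetical:

```python
def exergy_efficiency(ein_kw, eout_kw):
    """Second-law efficiency of a single conversion step."""
    return eout_kw / ein_kw

# Hypothetical (exergy in, exergy out) per component, in kW.
components = {
    "photovoltaic panels": (5.0, 4.1),
    "micro wind turbine":  (3.0, 2.6),
    "battery storage":     (2.0, 1.8),
    "converters":          (6.5, 6.2),
}

# Exergy destruction per component and in total: the quantity the
# PPO controller is trained to minimize.
destruction = {n: ein - eout for n, (ein, eout) in components.items()}
total_destruction = sum(destruction.values())
efficiency = {n: exergy_efficiency(*v) for n, v in components.items()}
```

In the study, these terms would be recomputed at each control step so the reinforcement learning reward can penalize destruction in the storage and conversion stages.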

Results indicate that the proposed approach achieves a noticeable improvement in exergy efficiency compared to conventional control strategies, primarily due to reduced losses in storage and conversion stages. These findings highlight the potential of AI-driven optimization for improving the efficiency and reliability of hybrid renewable energy systems.

  • Open access
  • 13 Reads
Securing Critical Energy Infrastructure: Cyber Risks, Challenges, and Defense Strategies in the Digital Era

The rapid digitalization of the energy sector has improved operational efficiency but significantly increased exposure to cyber threats against critical infrastructure. Recent analyses indicate that reported cyberattacks on energy assets more than doubled between 2019 and 2023, with ransomware accounting for nearly 40% of incidents. Individual attacks, such as the Colonial Pipeline breach, caused multi-day fuel disruptions and ransom payments totaling more than USD 4 million. Automation is rapidly expanding in energy sectors, including bioenergy and renewable energy power plants, where digital control systems improve efficiency and reliability. However, cybersecurity risks in these critical infrastructures remain less understood, with limited awareness among operators and engineers. To address this knowledge gap, this research examines potential cyber threats targeting power control systems and evaluates their impacts on process safety and energy security. Understanding these challenges is essential to ensure resilient and secure operations as facilities become more interconnected. This study provides valuable insights into emerging vulnerabilities, helping guide future risk mitigation strategies and strengthen the overall security of energy systems.

This study synthesizes cybersecurity risks, challenges, and mitigation strategies across oil and gas, electricity, nuclear, and renewable energy systems. Key challenges include securing legacy industrial control systems and virtual power plants, managing complex cloud-based and highly interconnected architectures, and addressing systemic gaps in workforce skills, governance, and security culture. The analysis highlights advanced detection and response strategies leveraging artificial intelligence and machine learning, multi-criteria decision-making methods for cyber-risk prioritization, and edge-based security architectures for distributed generation as promising technical approaches. In parallel, the study underscores the need for harmonized international standards, sector-specific regulation, continuous cyber exercises, and robust public–private partnerships to enhance resilience. Overall, the findings argue for a proactive, adaptive cybersecurity posture that integrates technical, organizational, and policy measures to safeguard critical energy assets and ensure a reliable, sustainable energy supply.

  • Open access
  • 7 Reads
Machine Learning-Based Predictive Maintenance Framework for Performance Degradation Detection in Energy Conversion Systems

Energy conversion systems, such as refrigeration units, electric motors, compressors, and thermal management systems, are central to energy use in both industry and households. Performance degradation and component failure in these systems lead to unexpected equipment downtime and significant losses in energy efficiency. They are typically maintained reactively or on fixed schedules, approaches that provide no tools for early detection of degradation. In this paper, we present an artificial intelligence-based predictive maintenance framework designed for monitoring and diagnosing Energy Conversion Systems (ECSs) using multi-sensor operational data. The proposed methodology integrates a variety of parameters, including temperature, vibration, current, and environmental conditions, to model the health of an ECS operating under normal conditions. Machine learning is then applied to the collected data to detect anomalies, identify early fault signatures, and predict likely failures before a major breakdown occurs. The framework supports both fault prevention and energy-efficiency optimization by correlating equipment health indicators with energy performance metrics. Experimental analyses show that predictive maintenance not only reduces unplanned downtime but also delivers verifiable improvements in system efficiency and operational stability. The study concludes that machine learning-based predictive maintenance provides a robust framework for improving the reliability and energy efficiency of energy conversion systems. By continuously learning from multi-sensor operational data, the framework identifies both early stages of performance degradation and potential failure modes, enabling timely maintenance.
The framework's combination of health indicators and energy performance metrics enables a quantitative evaluation of the efficiency loss caused by deterioration in energy conversion systems. These results point to the potential of AI-based maintenance to facilitate sustainable energy management, reduce operational losses, and strengthen the resilience of today's smart, connected energy conversion systems.
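
A minimal sketch of the anomaly detection step, assuming a simple z-score rule on a single sensor channel (the abstract does not name its algorithms, and the vibration trace below is invented):

```python
import math

def zscore_flags(readings, threshold=3.0):
    """Return indices of readings deviating more than `threshold`
    standard deviations from the series mean."""
    n = len(readings)
    mean = sum(readings) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in readings) / n)
    if std == 0:
        return []
    return [i for i, x in enumerate(readings)
            if abs(x - mean) / std > threshold]

# Hypothetical compressor vibration trace (mm/s) with one degraded reading.
vibration = [2.1, 2.0, 2.2, 2.1, 2.0, 2.1, 9.5, 2.2, 2.1, 2.0]
# Threshold relaxed for a short series; the outlier inflates the std.
anomalies = zscore_flags(vibration, threshold=2.5)
```

A production framework would fuse several channels (temperature, current, vibration) and learn the normal-operation baseline rather than fix it statically.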

  • Open access
  • 6 Reads
IuditAR: A Pilot Study on Overcoming Indonesia’s Household Energy Waste via AI-driven Game-As-Reality (GAR) Audits

Introduction

Indonesia faces a severe residential energy crisis: household usage accounts for around 42% of national electricity use, while energy literacy levels remain low. Electrical waste, such as vampire (standby) power, costs Indonesia an estimated 15–20 trillion rupiah annually in lost economic activity. Traditional energy audits are costly, require trained auditors, and cannot scale to millions of homes in Indonesia, leaving citizens with little awareness of energy-saving issues. As a result, a significant intention–action gap persists: average citizens struggle with abstract concepts such as energy saving because accessible energy education, and the means to translate understanding into sustainable action, are lacking. In this paper, we present IuditAR as a solution to the residential energy crisis in Indonesia, part of the Game-As-Reality (GAR) verse in sustainability, which employs a novel GAMERS protocol (Geste, Ambience, Mechanics, Engine, Reality, Sustainability). In contrast to passive education methods or pure gamification/game-based learning, the GAR-verse paradigm turns the home itself into a playable audit environment: users interact with real appliances as virtual game objects via AI-powered markerless augmented reality and earn in-game progress upon completing verified actions. This approach can make energy-saving actions markedly more engaging and effective.

Methods

We conduct a quasi-experiment as a pilot study to evaluate the feasibility and initial user engagement of the IuditAR system with 32 participants over a one-week period. IuditAR uses AI for device detection with around 95% accuracy, markerless augmented reality to generate context-aware energy-saving missions, and personalized advice. It operationalizes the GAMERS protocol as follows: G (Geste), a narrative that frames users as “Energy Guardians” on a hero’s journey; A (Ambience), a hybrid physical-digital environment that supports AR panels attached to detected appliances; M (Mechanics), a mechanism that awards XP/badges only upon verified completion of actions; E (Engine), the supporting technology and tools for education; R (Reality), a connection between in-app progress and real-world impact; and S (Sustainability), streak mechanisms and cumulative savings dashboards that sustain behavior change and retention. We measure outcomes via several indicators using the KAB (knowledge, attitudes, behavior) model for energy saving and the GAMERS framework experience scale.

Results and Conclusion

Our study found significant effects on several measures. Participants engaged with a mean of 36.5 AI scans per user across more than 50 household objects to identify potential savings. On the KAB (knowledge, attitude, and behavior) measures, participants’ knowledge scores and attitudes toward energy behaviors changed significantly from baseline (p<0.0001), indicating an intention to take energy-saving action. Participants also showed high compliance with the behavioral checklist, with more than 90% reporting sustained energy-saving actions such as unplugging gadgets after charging and turning them off. Our system validates these actions by verifying uploaded photos. On the GAMERS experience scale, the Reality and Geste components received the highest ratings and significantly predicted participants’ behavior. This indicates that tying in-game digital progression to AI-verified physical actions can be a powerful mechanism for locking in behavior change, one unavailable in conventional apps or typical games/simulations.

This pilot study demonstrates the feasibility and user acceptance of the GAMERS protocol as a scalable framework for energy education. While the current results rely on self-reported data with in-app photo verification and a limited number of participants, they provide a strong foundation for future research. The next phase of development will focus on validating these behavioral changes through integration with PLN (state utility) smart meter data and on conducting longitudinal Randomized Controlled Trials (RCTs) to objectively measure kilowatt-hour reduction. IuditAR represents a promising step toward scalable, AI-driven sustainability tools for the Global South, aligning with SDG 7 and Indonesia’s 2060 net-zero goals.

  • Open access
  • 11 Reads
Physics-Informed Neural Network Hysteresis Compensation for Precision Piezoelectric Energy Conversion Systems

Introduction
Piezoelectric actuators are critical components in precision energy systems, enabling applications ranging from renewable energy inspection robotics to vibration energy harvesting. However, inherent hysteresis nonlinearity degrades positioning accuracy by 10–15% of the full stroke, limiting energy conversion efficiency and operational precision. Standard proportional-integral-derivative (PID) control fails to adequately compensate for this history-dependent behavior. This work presents a hybrid physics-informed neural network (PINN) control strategy that combines analytical inverse hysteresis modeling with data-driven residual learning to achieve superior accuracy and robustness compared to purely analytical or data-driven approaches.

Methods
The system models a PI P-088.741 piezoelectric stack integrated into a flexure-guided stage, characterized by a second-order transfer function (natural frequency 3.45 kHz) and Bouc–Wen hysteresis. The control architecture comprises three cascaded components: (1) an analytical Bouc–Wen inverse model computing the baseline voltage; (2) a physics-informed residual neural network (RRN) predicting a voltage correction term; and (3) a PID feedback controller with anti-windup. The RRN implementation utilizes a compact feedforward architecture (two hidden layers, 64 neurons each, ReLU activation) trained to minimize mean-squared error on the residual voltage. Crucially, the network is physics-informed through its input features, which include reference displacement, tracking error, and physics-derived inverse voltage terms, ensuring the network learns context-aware corrections grounded in physical system states rather than black-box mapping.
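
The PID feedback layer with anti-windup (component 3 above) can be sketched as follows; the gains, limits, and conditional-integration scheme are illustrative choices, not the paper's tuned values:

```python
class PID:
    """PID controller with output clamping and conditional-integration
    anti-windup: the integrator freezes while the output is saturated
    and the error would drive it further into saturation."""

    def __init__(self, kp, ki, kd, dt, u_min, u_max):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dt, self.u_min, self.u_max = dt, u_min, u_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        u_unsat = (self.kp * error + self.ki * self.integral
                   + self.kd * derivative)
        u = max(self.u_min, min(self.u_max, u_unsat))
        saturating = ((u_unsat > self.u_max and error > 0) or
                      (u_unsat < self.u_min and error < 0))
        if not saturating:
            self.integral += error * self.dt
        return u

# Illustrative gains and voltage limits (not the paper's values).
pid = PID(kp=1.0, ki=0.5, kd=0.0, dt=0.001, u_min=-5.0, u_max=5.0)
u = pid.update(10.0)   # large error: output clamps, integrator frozen
```

In the cascaded architecture this correction voltage is summed with the Bouc–Wen inverse term and the network's residual prediction before being applied to the stack.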

Results
Validation was conducted via high-fidelity simulation under both nominal and parameter mismatch conditions. Performance metrics included RMS tracking error, transient response specifications, and frequency robustness. Against a baseline PID controller, the hybrid (inverse and PID) configuration achieved an 88–91% reduction in RMS tracking error across 1–10 Hz sinusoidal tests. The hybrid PINN variant maintained robustness under parameter mismatch, with performance degradation remaining below 10% for unseen frequencies (2–6 Hz), though slight degradation occurred at higher unseen frequencies (7–9 Hz). Transient validation using a 10 µm step command demonstrated settling times of approximately 6 ms (well within the 0.5 s specification) and negligible overshoot (<0.02%). The results confirm that the residual learning approach effectively compensates for model uncertainties without sacrificing stability.

Conclusions
This research demonstrates the feasibility of hybrid physics-informed residual learning for precision piezoelectric positioning in energy systems. By leveraging an analytical physics-based inverse as the primary compensator and a neural network for residual correction, the proposed method balances interpretability with adaptive accuracy. The validated controller enables high-precision operation under hysteresis-limited conditions, suitable for smart energy inspection and harvesting applications. Future work will focus on hardware implementation and expanding robustness validation to broader frequency ranges and amplitude variations.
