Federated Edge Learning for Distributed Weed Detection in Precision Agriculture Using Multimodal Sensor Fusion
Published: 07 November 2025 by MDPI
in The 12th International Electronic Conference on Sensors and Applications, session Smart Agriculture Sensors
https://doi.org/10.3390/ECSA-12-26608
Abstract: Background: Weed detection is an important component of precision agriculture because accurate weed identification and treatment have a direct impact on crop yield and resource efficiency. Recent breakthroughs in artificial intelligence (AI) have enabled automatic weed recognition systems; nevertheless, standard centralized machine learning models present substantial obstacles, such as high communication overhead, privacy issues, and limited scalability in remote farming contexts. To address these limitations, federated edge learning (FEL) combined with deep learning and multimodal sensor fusion provides a viable solution by allowing distributed model training while maintaining data privacy. Objective: Our goal in this work is to develop a privacy-preserving, distributed weed detection and management system. The proposed system integrates FEL with deep learning and multimodal sensor fusion to enhance model performance while simultaneously minimizing data transfer, latency, and energy consumption. Materials and Methods: We used multimodal sensors, namely LiDAR (Light Detection and Ranging), RGB (Red-Green-Blue) cameras, multispectral imaging devices, and soil moisture sensors, placed in controlled agricultural plots. Each modality provided complementary information for weed identification: RGB supplied texture and color cues, multispectral imaging captured spectral reflectance patterns, LiDAR delivered structural depth information, and soil sensors contributed contextual environmental conditions. For robustness, three sensor fusion techniques were used: early fusion (feature-level concatenation), mid fusion (intermediate feature aggregation), and late fusion (decision-level integration). Deep learning models, including convolutional neural networks (CNNs), LSTM-CNN hybrids, and Vision Transformers, were trained using standardized parameters. A proposed federated CNN (FedCNN) was deployed across multiple edge devices, each trained locally on its own sensor data without exchanging raw data, using the FedAvg and FedProx algorithms. Validation was performed using a stratified 80/20 train-test split combined with 5-fold cross-validation to ensure model generalization. Model performance was assessed using accuracy, precision, recall, F1-score, AUC, latency, and energy consumption, enabling a holistic evaluation of both predictive quality and computational efficiency. Results: The experiments show that FedCNN outperforms the other models, achieving the highest accuracy (94.1%), precision (94.3%), recall (93.9%), F1-score (94.1%), and AUC (94.1%) under the hybrid fusion strategy. In the comparison of centralized and federated learning, the edge-based FEL configuration achieved 94.1% accuracy, 120 ms latency, 300 mWh energy consumption, and a low privacy risk level. Conclusion: The combination of FEL and multimodal sensor fusion provides a reliable and scalable approach for weed detection in precision agriculture. By processing data locally and collaboratively at the edge, the system achieves high accuracy, decreases response time, lowers energy consumption, and preserves data privacy.
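The three fusion strategies in the Methods differ only in where the modality streams are combined. As a minimal sketch of the idea (a PyTorch illustration for two of the modalities under assumed layer sizes, channel counts, and a binary weed/no-weed output; it is not the authors' implementation):

import torch
import torch.nn as nn

class EarlyFusion(nn.Module):
    """Feature-level fusion: concatenate raw channels before one backbone."""
    def __init__(self, rgb_ch=3, ms_ch=5, n_classes=2):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(rgb_ch + ms_ch, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes))

    def forward(self, rgb, ms):
        return self.backbone(torch.cat([rgb, ms], dim=1))

class MidFusion(nn.Module):
    """Intermediate fusion: per-modality encoders, concatenated feature vectors."""
    def __init__(self, rgb_ch=3, ms_ch=5, n_classes=2):
        super().__init__()
        def enc(c):
            return nn.Sequential(
                nn.Conv2d(c, 16, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.rgb_enc, self.ms_enc = enc(rgb_ch), enc(ms_ch)
        self.head = nn.Linear(32, n_classes)

    def forward(self, rgb, ms):
        return self.head(torch.cat([self.rgb_enc(rgb), self.ms_enc(ms)], dim=1))

class LateFusion(nn.Module):
    """Decision-level fusion: average the per-modality class logits."""
    def __init__(self, rgb_model, ms_model):
        super().__init__()
        self.rgb_model, self.ms_model = rgb_model, ms_model

    def forward(self, rgb, ms):
        return 0.5 * (self.rgb_model(rgb) + self.ms_model(ms))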
Keywords: deep learning; FEL; sensor fusion; weed detection; precision agriculture
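The abstract names FedAvg and FedProx for the federated rounds. A minimal sketch of FedAvg's server-side, sample-weighted averaging, plus the proximal term FedProx adds to each client's local loss (function names and the mu value are illustrative assumptions, not the authors' code):

import torch

def fedavg(client_states, client_sizes):
    """Sample-weighted average of client state_dicts (FedAvg).
    Buffers such as batch-norm counters are averaged as floats in this sketch."""
    total = sum(client_sizes)
    global_state = {}
    for key in client_states[0]:
        global_state[key] = sum(
            state[key].float() * (n / total)
            for state, n in zip(client_states, client_sizes))
    return global_state

def fedprox_penalty(local_model, global_state, mu=0.01):
    """FedProx proximal term added to the client loss:
    (mu / 2) * ||w_local - w_global||^2, which limits client drift."""
    penalty = 0.0
    for name, param in local_model.named_parameters():
        penalty = penalty + (param - global_state[name].detach()).pow(2).sum()
    return 0.5 * mu * penalty

# Usage (hypothetical): after each round the server would call
#   new_global = fedavg([m.state_dict() for m in client_models], sample_counts)
# and broadcast new_global back to the edge devices, so raw sensor data never leaves a node.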

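The validation protocol (stratified 80/20 split with 5-fold cross-validation) and the reported quality metrics map directly onto scikit-learn utilities. The sketch below uses placeholder features and binary weed/no-weed labels as an assumption; latency and energy would be measured on-device, outside this snippet:

import numpy as np
from sklearn.model_selection import train_test_split, StratifiedKFold
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

# Placeholder data standing in for the multimodal features (not the study's data).
X = np.random.rand(200, 16)
y = np.random.randint(0, 2, 200)

# Stratified 80/20 train-test split, as described in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# 5-fold cross-validation over the training portion.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for tr_idx, va_idx in skf.split(X_tr, y_tr):
    pass  # train a candidate model on X_tr[tr_idx], validate on X_tr[va_idx]

def report(y_true, y_pred, y_score):
    """Compute the quality metrics reported in the study."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred),
        "auc": roc_auc_score(y_true, y_score),
    }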