Enhancing Live Fence Detection through Foundation Model Integration: A Scene-Level Deep Learning Approach
1  Remote Sensing | Spatial Analysis lab (REMOSA), Department of Environment, Ghent University.
2  Facultad de Ingeniería en Eléctrica y Computación, ESPOL Polytechnic University, Escuela Superior Politécnica del Litoral, Guayaquil, Ecuador, dochoa@fiec.espol.edu.ec
Academic Editor: Fabio Tosti

Abstract:

Monitoring live fences in agroforestry landscapes is crucial for understanding ecosystem connectivity and biodiversity conservation, yet traditional detection methods struggle with their complex spatial-spectral characteristics. Building upon our previous work on multi-stream deep learning for live fence detection, which achieved over 83% accuracy, we propose a novel approach that integrates foundation models to enhance scene-level classification. Our framework combines specialized vegetation-detection features with pre-trained visual knowledge through a dual-stream architecture while leveraging optimal spectral band configurations. The methodology uses NIR-Green-Blue bands with NDVI integration, enhanced by self-attention mechanisms for improved contextual understanding. We evaluated the approach on multi-temporal PlanetScope imagery from three distinct agroforestry sites in Ecuador, covering both dry and rainy seasons. This research advances automated live fence monitoring by combining specialized spectral analysis with the robust feature learning of foundation models, offering an improved solution for sustainable landscape management. The proposed approach aims to enhance detection accuracy while maintaining computational efficiency, supporting practical applications in conservation planning and policy implementation.

Keywords: Deep Learning; Foundation Models; Live Fences; Remote Sensing; Scene Classification; Agroforestry
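
To make the dual-stream idea concrete, the following is a minimal PyTorch sketch, not the authors' released code: the class and function names (DualStreamClassifier, prepare_channels), the layer sizes, and the use of an ImageNet ResNet-18 as a stand-in for a foundation model are all illustrative assumptions. It shows one stream embedding the NIR-Green-Blue + NDVI stack with a small CNN, a second frozen pretrained stream, and self-attention fusing the two feature tokens for scene-level classification, following the abstract's description.

```python
# Minimal sketch of a dual-stream scene classifier (illustrative only).
import torch
import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights


def prepare_channels(planetscope: torch.Tensor) -> torch.Tensor:
    """Build the NIR-Green-Blue + NDVI stack from a (B, 4, H, W) tensor,
    assuming PlanetScope 4-band ordering: Blue, Green, Red, NIR."""
    blue, green, red, nir = planetscope.unbind(dim=1)
    ndvi = (nir - red) / (nir + red + 1e-6)  # NDVI = (NIR - Red) / (NIR + Red)
    return torch.stack([nir, green, blue, ndvi], dim=1)  # (B, 4, H, W)


class DualStreamClassifier(nn.Module):
    """Hypothetical fusion of a spectral CNN stream with a frozen
    pretrained backbone standing in for a foundation model."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Spectral stream: lightweight CNN over the 4-channel stack.
        self.spectral = nn.Sequential(
            nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Foundation stream: frozen ImageNet ResNet-18 (a stand-in).
        backbone = resnet18(weights=ResNet18_Weights.DEFAULT)
        backbone.fc = nn.Identity()  # expose the 512-d feature vector
        for p in backbone.parameters():
            p.requires_grad = False
        self.foundation = backbone
        self.proj = nn.Linear(512, 64)
        # Self-attention over the two fused feature tokens.
        self.attn = nn.MultiheadAttention(embed_dim=64, num_heads=4,
                                          batch_first=True)
        self.head = nn.Linear(64, num_classes)

    def forward(self, planetscope: torch.Tensor) -> torch.Tensor:
        x = prepare_channels(planetscope)
        f_spec = self.spectral(x)                        # (B, 64)
        # Feed the NIR-Green-Blue channels to the 3-channel RGB backbone.
        f_found = self.proj(self.foundation(x[:, :3]))   # (B, 64)
        tokens = torch.stack([f_spec, f_found], dim=1)   # (B, 2, 64)
        fused, _ = self.attn(tokens, tokens, tokens)
        return self.head(fused.mean(dim=1))              # (B, num_classes)


# Example: scene-level logits for a batch of two 128x128 4-band patches.
# logits = DualStreamClassifier()(torch.rand(2, 4, 128, 128))
```

Freezing the pretrained stream keeps the trainable parameter count low, which is consistent with the abstract's stated goal of maintaining computational efficiency while reusing generic visual knowledge.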