
Development of an innovative and efficient urban vegetation monitoring system for sustainable urban ecology

Abstract
In the rapidly urbanizing environment of New Zealand, monitoring and preserving urban green spaces are crucial for ecological balance and sustainability. Urban vegetation contributes significantly to the ecological and social fabric of city life, making accurate information about its distribution, health, and diversity essential for effective urban management and environmental protection. This research addresses a significant gap in current methodologies for monitoring urban greenery by introducing a more efficient, accurate, and less labor-intensive approach based on deep learning.

Extensive data collection, preprocessing, and training were central to this research. Image collection routes were carefully planned to ensure a comprehensive dataset representing diverse urban vegetation. A major challenge was data quality and privacy: continuous image capture in urban areas produced blurry, repetitive, and non-vegetative frames and raised privacy concerns. To mitigate these issues, a set of filters was introduced to improve the dataset's quality and ensure privacy protection (one such quality check is sketched after the abstract).

Focusing on advanced computer vision systems, this thesis employed the YOLOv5 and YOLOv8 models for automated urban vegetation detection using RGB images from car-mounted cameras. These models were chosen for their effectiveness in image recognition, with a range of image augmentation techniques and annotation tools enriching the training process. The thesis also explored transfer learning and layer freezing in model training (also sketched after the abstract), along with a distinctive method of using larger bounding boxes for tree annotations to improve the detection of specific tree species.

The models' performance was evaluated through precision and recall. For urban vegetation detection, YOLOv5x demonstrated the highest precision (0.93) and recall (0.90). In tree species detection, the YOLOv5m with transfer learning model achieved a strong balance of precision (0.95), recall (0.77), and accuracy (0.86), while YOLOv8m with transfer learning excelled in recall (0.93), making it useful for applications requiring a high true-positive rate. For specific tree detection, the YOLOv8m_Bigger_size model led with high precision (0.95), recall (0.98), and the top mAP50-95 score (0.79) in validation. These results underscore the models' applicability across diverse urban scenarios and support the feasibility of this approach for large-scale urban vegetation monitoring.

The findings indicate that deep learning technology, particularly the YOLOv5m and YOLOv8m models, significantly enhances the efficiency and accuracy of urban vegetation monitoring. The use of car-mounted cameras for data collection marks a significant advance over traditional survey methods, enabling the capture of large-scale urban vegetation data with minimal intrusion and high efficiency. In summary, this study demonstrates that applying advanced computer vision systems can markedly improve urban vegetation monitoring. The integration of YOLOv5 and YOLOv8 models in New Zealand's urban landscapes offers a novel approach to the challenges of urbanization, and the models are adaptable for use in other New Zealand cities with access to similar urban vegetation datasets, which are crucial for evaluating model accuracy and enhancing precision in future models.
This thesis advances our understanding of urban vegetation monitoring using computer vision and highlights the potential of deep learning models in enhancing environmental conservation efforts. The insights and methodologies presented provide valuable tools for urban environmental management and suggest new directions for future research and applications in this critical area.
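As an illustration of the kind of dataset quality filter the abstract describes, the following is a minimal sketch assuming a variance-of-Laplacian blur check in OpenCV. The `is_blurry` helper, the `frames` directory, and the threshold value are hypothetical and are not taken from the thesis itself.

```python
import cv2
from pathlib import Path

# Hypothetical blur filter for dash-cam frames: the variance of the
# Laplacian response is low when a frame has few sharp edges, which
# usually indicates blur. The threshold is an assumed starting point
# and would need tuning against the actual dataset.
BLUR_THRESHOLD = 100.0

def is_blurry(image_path: str, threshold: float = BLUR_THRESHOLD) -> bool:
    """Return True if the image at image_path appears blurry."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(f"Could not read image: {image_path}")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var() < threshold

# Keep only sharp frames from a (hypothetical) directory of captures.
sharp_frames = [p for p in Path("frames").glob("*.jpg") if not is_blurry(str(p))]
```

In practice, repetitive and non-vegetative frames would need additional filters (for example, frame-similarity or classifier-based checks); the sketch above covers only the blur case.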
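Similarly, the transfer learning and layer freezing mentioned in the abstract could look like the following minimal sketch using the Ultralytics API. The dataset YAML path, epoch count, image size, and number of frozen layers are illustrative assumptions, not the thesis's actual configuration.

```python
from ultralytics import YOLO

# Start from a COCO-pretrained YOLOv8m checkpoint (transfer learning)
# and freeze the first 10 layers so the backbone keeps its pretrained
# features while the detection head adapts to the vegetation classes.
model = YOLO("yolov8m.pt")
model.train(
    data="urban_vegetation.yaml",  # hypothetical dataset config
    epochs=100,                    # assumed training budget
    imgsz=640,                     # assumed input resolution
    freeze=10,                     # freeze the first 10 layers
)

# Validation reports precision, recall, and mAP50-95, the metrics
# cited in the abstract for comparing model variants.
metrics = model.val()
print(metrics.box.map)  # mAP50-95 on the validation split
```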
Type
Thesis
Date
2024
Publisher
The University of Waikato
Rights
All items in Research Commons are provided for private study and research purposes and are protected by copyright with all rights reserved unless otherwise indicated.