Riparian planting
Satellite(s): Sentinel-2, SkySat, WorldView.
Monitoring element: Vegetation reflectance, LiDAR backscatter.
Description technique: To date, most riparian vegetation monitoring techniques have relied on manual imagery interpretation, with data from high spatial resolution spaceborne sensors, aerial photography and UAVs used depending on the scale of the target environment (Klemas, 2014; Huylenbroeck et al., 2020). In most cases, high resolution data is necessary for riparian surveys, due to the narrow nature of riparian zones and the need to resolve individual, and often small, vegetation features (Johansen and Phinn, 2006). At 1 to 10 m scales, riparian planting establishment can be monitored from single observations, either manually, through OBIA (Object Based Image Analysis) or per-pixel classification, or in an ongoing fashion by using a time series of a vegetation index to track status. While relatively little research has focused on mapping riparian establishment at the scale of individual plants, a number of studies have demonstrated the utility of both existing and newly developed techniques for mapping individual vegetation features from high resolution imagery, notably through the implementation of deep learning (e.g. Weinstein et al., 2019; Hartling et al., 2019). Hartling et al. (2019) combined LiDAR with VHR imagery (WorldView-2 and -3) to compare a range of classification techniques for detecting and classifying tree species. Results showed that the deep learning method (DenseNet) significantly outperformed the traditional classification methods (Random Forest and Support Vector Machines), with an overall accuracy of 82% compared to 52% for RF and SVM. Processing of drone capture, including video, has also seen significant development in recent years, with deep learning object detection being applied to oblique images and video, as well as traditional planform orthomosaics, with promising accuracy.
(https://www.youtube.com/watch?v=7SEfpY3z5Vg) LiDAR may also be used to verify establishment of existing planting via the recent regional LiDAR capture, with the drawback that it represents a single observation. Combining an airborne LiDAR-derived DTM with an aerial- or UAV-derived photogrammetric DSM represents a possible pathway for ongoing monitoring of riparian establishment via height modelling of features detected from the captured imagery.
Accuracy / Resolution: Variable spatial and temporal resolution according to sensors.
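The DTM/DSM combination described above reduces to differencing two co-registered height surfaces. A minimal sketch, assuming the LiDAR DTM and photogrammetric DSM have already been loaded (e.g. via rasterio) as aligned 2-D arrays on the same grid; the 0.5 m establishment threshold is an illustrative assumption, not a value from the source:

```python
import numpy as np

def canopy_height_model(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Canopy height model = photogrammetric DSM minus LiDAR-derived DTM."""
    chm = dsm - dtm
    return np.clip(chm, 0, None)  # negative heights are noise; clamp to zero

def established(chm: np.ndarray, min_height_m: float = 0.5) -> np.ndarray:
    """Flag cells where vegetation exceeds an establishment height threshold (assumed value)."""
    return chm >= min_height_m

# Toy 3x3 surfaces in metres, standing in for real co-registered rasters
dtm = np.array([[10.0, 10.0, 10.0]] * 3)
dsm = dtm + np.array([[0.0, 0.6, 1.2],
                      [0.1, 0.4, 0.9],
                      [0.0, 0.7, 1.5]])
chm = canopy_height_model(dsm, dtm)
print(int(established(chm).sum()))  # prints 5: cells at or above 0.5 m
```

Repeating this differencing across successive photogrammetric captures against the fixed LiDAR DTM is what enables ongoing height-change monitoring.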
Case study: Belcore et al. (2021) demonstrate an end-to-end workflow for detecting, locating and assigning species classes to trees and shrubs within a riparian environment using RGB and multispectral UAV sensors. The generated orthomosaics were segmented via an OBIA approach using the original imagery, derived spectral indices and textural information, combined with a canopy height model obtained by comparing a UAV-derived DTM and DSM. The results compared well to field data, with improved accuracy achieved by including multi-temporal data, allowing phenology to aid species differentiation.
Example of LiDAR capture over recently planted riparian zones south of Auckland: https://induforauckland.users.earthengine.app/view/mauku-lidar
Example of deep learning applied to aerial RGB imagery to delineate native forest features: https://www.indufor.co.nz/blog/post/mapping-native-forest-using-deep-learning-image-segmentation
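The vegetation-index time series mentioned in the description can be sketched simply. This hypothetical example computes mean NDVI over a planted plot at several observation dates; the band values are toy reflectances, not data from the cited studies, and real inputs would come from e.g. Sentinel-2 red and NIR bands:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index, guarding against division by zero."""
    denom = nir + red
    return np.where(denom > 0, (nir - red) / np.where(denom == 0, 1, denom), 0.0)

# Hypothetical (nir, red) reflectance pairs for three observation dates
observations = [
    (np.full((2, 2), 0.20), np.full((2, 2), 0.15)),  # at planting
    (np.full((2, 2), 0.35), np.full((2, 2), 0.10)),  # six months on
    (np.full((2, 2), 0.45), np.full((2, 2), 0.08)),  # one year on
]
series = [float(ndvi(nir, red).mean()) for nir, red in observations]
# A rising plot-mean NDVI is a simple proxy for successful establishment.
print(all(a < b for a, b in zip(series, series[1:])))  # prints True
```

Per-species phenology, as exploited by Belcore et al. (2021), would show up here as distinct seasonal shapes in such series rather than a simple monotonic rise.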
Also fits domain: UAV.
Benefits:
Limitations:
Applicability for Northland: Yes, this approach is applicable to Northland. Monitoring of riparian establishment would benefit from the recent regional LiDAR capture and recent developments in the processing of high resolution imagery, combined with project-specific captures. The development of light aircraft camera systems (Cessna-mounted) has also reduced the cost of aerial surveying, a good option if planted areas are widely scattered. These systems capture colour infra-red photography, which can also be processed to a point cloud using the image overlap; this information assists with streamlining monitoring processes and is used to monitor vegetation height changes. Techniques applying optical data will be limited in coverage and temporal granularity by the persistent cloud cover in the region, particularly during the winter months. Mature cloud-masking techniques are directly available for open access multispectral data (e.g. Landsat and Sentinel-2). When using commercial data, care must be taken to ensure that sufficiently cloud-free imagery is available, as cloud masking is less mature and ordering a large volume of imagery to guarantee complete cloud-free coverage across multiple observations can become cost prohibitive.
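For open access data, the mature cloud masking referred to above is straightforward to apply. A minimal sketch using the Sentinel-2 Level-2A Scene Classification (SCL) band, where classes 3 (cloud shadow), 8/9 (cloud medium/high probability) and 10 (thin cirrus) are treated as unusable; the toy array stands in for a real SCL raster:

```python
import numpy as np

# Sentinel-2 L2A SCL classes treated as cloud-affected:
# 3 = cloud shadow, 8 = cloud medium probability,
# 9 = cloud high probability, 10 = thin cirrus.
CLOUDY_SCL = [3, 8, 9, 10]

def clear_sky_mask(scl: np.ndarray) -> np.ndarray:
    """True where the pixel is usable (not cloud, shadow, or cirrus)."""
    return ~np.isin(scl, CLOUDY_SCL)

def clear_fraction(scl: np.ndarray) -> float:
    """Fraction of usable pixels; handy for screening scenes before ordering."""
    return float(clear_sky_mask(scl).mean())

# Toy SCL tile: vegetation (4), water (6), cloud (9), cirrus (10)
scl = np.array([[4, 4, 9],
                [6, 4, 10],
                [4, 4, 4]])
print(round(clear_fraction(scl), 2))  # prints 0.78 (7 of 9 pixels usable)
```

Screening archive scenes by clear fraction in this way, before purchase, is exactly the cost-control step the commercial-data caveat above calls for.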
Publication references:
Huylenbroeck, L., Laslier, M., Dufour, S., Georges, B., Lejeune, P. and Michez, A., 2020. Using remote sensing to characterize riparian vegetation: A review of available tools and perspectives for managers. Journal of Environmental Management, 267, p.110652.
Johansen, K. and Phinn, S., 2006. Mapping structural parameters and species composition of riparian vegetation using IKONOS and Landsat ETM+ data in Australian tropical savannas. Photogrammetric Engineering & Remote Sensing, 72(1), pp.71-80.
Doody, T.M., Lewis, M., Benyon, R.G. and Byrne, G., 2014. A method to map riparian exotic vegetation (Salix spp.) area to inform water resource management. Hydrological Processes, 28(11), pp.3809-3823.
Weinstein, B.G., Marconi, S., Bohlman, S., Zare, A. and White, E., 2019. Individual tree-crown detection in RGB imagery using semi-supervised deep learning neural networks. Remote Sensing, 11(11), p.1309.
Ashqar, B.A., Abu-Nasser, B.S. and Abu-Naser, S.S., 2019. Plant seedlings classification using deep learning. International Journal of Academic Information Systems Research (IJAISR), 3(1), pp.7-14.
Belcore, E., Pittarello, M., Lingua, A.M. and Lonati, M., 2021. Mapping riparian habitats of Natura 2000 network (91E0*, 3240) at individual tree level using UAV multi-temporal and multi-spectral data. Remote Sensing, 13(9), p.1756.