Riparian planting

Satellite(s)

Sentinel-2, SkySat, WorldView.

Monitoring element

Vegetation reflectance, LiDAR backscatter.

Description technique

To date, most riparian vegetation monitoring has relied on manual image interpretation, with data from high spatial resolution spaceborne sensors, aerial photography and UAVs used depending on the scale of the target environment (Klemas, 2014; Huylenbroeck et al., 2020). In most cases, high resolution data are necessary for riparian surveys, due to the narrow nature of riparian zones and the need to resolve individual, often small, vegetation features (Johansen and Phinn, 2006). At 1 to 10 m resolution, riparian planting establishment can be monitored from single observations, either manually, through Object Based Image Analysis (OBIA) or per-pixel classification, or on an ongoing basis by using a time series of a vegetation index to track vegetation status.
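
As an illustration of the index time-series approach, the sketch below tracks mean NDVI over a planted reach across successive acquisitions; a rising trend suggests successful establishment. The file names, dates and band order are assumptions, and in practice the imagery would first be clipped to the riparian polygon and cloud masked.

```python
# Minimal sketch, assuming clipped two-band GeoTIFFs (band 1 = red,
# band 2 = NIR) named by acquisition date. For Sentinel-2 these would be
# B04 and B08 at 10 m; file names and dates below are placeholders.
import numpy as np
import rasterio

def mean_ndvi(path: str) -> float:
    """Mean NDVI over a GeoTIFF clipped to the planted area."""
    with rasterio.open(path) as src:
        red = src.read(1).astype("float32")
        nir = src.read(2).astype("float32")
    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)  # avoid divide-by-zero
    return float(np.nanmean(ndvi))

dates = ["2022-10-01", "2023-01-15", "2023-04-20"]   # assumed dates
series = {d: mean_ndvi(f"site_clip_{d}.tif") for d in dates}

for date, value in series.items():
    print(date, round(value, 3))  # a rising trend suggests establishment
```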

While relatively little research exists on mapping riparian establishment at the scale of individual plants, a number of studies have demonstrated the utility of both existing and newly developed techniques for mapping individual vegetation features from high resolution imagery, notably through the implementation of deep learning (e.g. Weinstein et al., 2019; Hartling et al., 2019). Hartling et al. (2019) combined LiDAR with VHR imagery (WorldView-2 and -3) to compare a range of classification techniques for detecting and classifying tree species. Their results showed that the deep learning method (DenseNet) significantly outperformed the traditional classification methods (Random Forest and Support Vector Machines), with an overall accuracy of 82%, compared to 52% for RF and SVM.
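
The sketch below illustrates the kind of classifier comparison described above, fitting a Random Forest and an SVM to per-object features. It is not Hartling et al.'s pipeline; the feature table and labels are synthetic stand-ins for values that would normally be extracted from VHR imagery, spectral indices and LiDAR heights.

```python
# Illustrative only (not Hartling et al.'s pipeline): compare RF and SVM
# on per-object features. X stands in for band means, indices and LiDAR
# height; y stands in for species labels from field reference data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))        # synthetic feature table
y = rng.integers(0, 4, size=500)     # synthetic four-class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("RF", RandomForestClassifier(n_estimators=200)),
                    ("SVM", SVC(kernel="rbf"))]:
    model.fit(X_tr, y_tr)
    print(name, "overall accuracy:", accuracy_score(y_te, model.predict(X_te)))
```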

Processing of drone capture, including video, has also seen significant development in the last few years, with deep learning object detection applied to oblique images and video, as well as to traditional planform orthomosaics, with promising accuracy.

(https://www.youtube.com/watch?v=7SEfpY3z5Vg)
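
A minimal sketch of this frame-by-frame detection approach follows. The video file name is a placeholder, and the generic COCO-pretrained weights serve purely as a stand-in; detecting planted seedlings would require fine-tuning the detector on hand-annotated drone frames.

```python
# Sketch: sample frames from drone video and run an object detector.
# The video path is a placeholder and the COCO-pretrained weights are a
# stand-in; seedling detection would need fine-tuning on annotated frames.
import cv2
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

cap = cv2.VideoCapture("riparian_flight.mp4")  # assumed file name
frame_idx = 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % 30 == 0:                    # roughly one frame per second
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
        with torch.no_grad():
            pred = model([tensor])[0]
        keep = pred["scores"] > 0.5
        print(f"frame {frame_idx}: {int(keep.sum())} detections")
    frame_idx += 1
cap.release()
```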

LiDAR may also be used to verify establishment of existing planting via the recent regional LiDAR capture, with the drawback that it represents a single observation. Combining an airborne LiDAR derived DTM with an aerial or UAV derived photogrammetric DSM represents a possible pathway for ongoing monitoring of riparian establishment via height modelling of features detected from the captured imagery.
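
A minimal sketch of this DTM/DSM differencing is given below, assuming the two rasters are already co-registered on the same grid and vertical datum (the file names are placeholders).

```python
# Minimal sketch of CHM = DSM - DTM. Assumes the LiDAR DTM and the
# photogrammetric DSM are co-registered on the same grid and vertical
# datum; file names are placeholders.
import numpy as np
import rasterio

with rasterio.open("lidar_dtm.tif") as dtm_src, \
     rasterio.open("uav_dsm.tif") as dsm_src:
    dtm = dtm_src.read(1).astype("float32")
    dsm = dsm_src.read(1).astype("float32")
    profile = dtm_src.profile

chm = np.clip(dsm - dtm, 0, None)  # clamp negative heights to zero

profile.update(dtype="float32", count=1)
with rasterio.open("chm.tif", "w", **profile) as dst:
    dst.write(chm, 1)

# Pixels above a height threshold (e.g. 0.5 m) can be flagged as
# established planting for change tracking between captures.
print("pixels above 0.5 m:", int((chm > 0.5).sum()))
```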

Accuracy / Resolution

Spatial and temporal resolution vary according to the sensor(s) used.

Case study

Belcore et al. (2021) demonstrate an end-to-end workflow for detecting, locating and assigning species classes to trees and shrubs within a riparian environment using RGB and multispectral UAV sensors. The orthomosaics generated were segmented via an OBIA approach using the original imagery, derived spectral indices and textural information, combined with a canopy height model obtained by comparing a UAV derived DTM and DSM. The results compared well to field data, with improved accuracy achieved by including multi-temporal data, allowing phenology to aid species differentiation.
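
The sketch below gives a rough OBIA-style illustration in the spirit of this workflow (it is not Belcore et al.'s exact method): an orthomosaic tile is over-segmented, and each segment is attributed with mean NDVI and mean canopy height for later classification. The rasters here are random stand-ins.

```python
# Rough OBIA-style sketch (not Belcore et al.'s exact method): over-segment
# an orthomosaic tile, then attribute each segment with mean NDVI and mean
# canopy height. The arrays below are random stand-ins for real rasters.
import numpy as np
from skimage.segmentation import slic

rng = np.random.default_rng(0)
rgb = rng.random((256, 256, 3))       # stand-in UAV orthomosaic tile
ndvi = rng.random((256, 256))         # stand-in derived NDVI raster
chm = rng.random((256, 256)) * 3.0    # stand-in canopy height model (m)

segments = slic(rgb, n_segments=300, compactness=10, start_label=1)

features = [(seg_id,
             float(ndvi[segments == seg_id].mean()),
             float(chm[segments == seg_id].mean()))
            for seg_id in np.unique(segments)]

# Each (segment id, mean NDVI, mean height) row would feed a classifier;
# repeating the features across dates lets phenology aid discrimination.
print(features[:3])
```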

Example of LiDAR capture over recently planted riparian zones south of Auckland:

https://induforauckland.users.earthengine.app/view/mauku-lidar

Example of deep learning applied to aerial RGB imagery to delineate native forest features: https://www.indufor.co.nz/blog/post/mapping-native-forest-using-deep-learning-image-segmentation

Also fits domain

UAV.

Benefits

  • Even where commercial imagery (aerial or spaceborne) must be purchased, remote sensing techniques will still generally be cheaper than field surveying.

  • Aerial and spaceborne techniques offer a wider, synoptic overview and the potential for ongoing monitoring with greater temporal granularity than can be achieved with site visits.

Limitations

  • When non-target undergrowth is present, relatively coarse resolution imagery will be unable to distinguish the target features, leading to a mixed spectral signature which can be difficult to interpret.

  • Any classification methodology, including deep learning, requires a minimum number of pixels in order to resolve information about the target object. If imagery resolution is too low or the target feature too small, prediction accuracy will be poor.

  • UAV based techniques can be both expensive and time consuming, require site access, and are limited in the area that can be surveyed by flight time and battery life constraints.

  • The predictive power of deep learning techniques can be significantly limited by a lack of training data, which for object detection and segmentation techniques comes in the form of hand annotated examples. Generating these training samples is a laborious process and can be difficult if the input imagery contains only a small number of examples (Hartling et al., 2019).

Applicability for Northland

Yes, this approach is applicable to Northland.

Monitoring of riparian establishment would benefit from the recent regional LiDAR capture and recent developments in the processing of high resolution imagery, combined with project specific captures. The development of light aircraft camera systems (Cessna-mounted) has also reduced the cost of aerial surveying, making it a good option where planted areas are widely scattered. These systems capture colour infra-red photography, which can also be processed to a point cloud using the image overlap; this streamlines monitoring processes and supports tracking of vegetation height changes.

Techniques applying optical data will be limited in coverage and temporal granularity by the persistent cloud cover in the region, particularly during the winter months. Mature cloud-masking techniques are readily available for open access multispectral data (e.g. Landsat and Sentinel-2). When using commercial data, care must be taken to ensure sufficient cloud-free imagery is available: cloud masking for these sources is less mature, and ordering a large volume of imagery to guarantee complete cloud-free coverage across multiple observations can become cost prohibitive.
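
For open access Sentinel-2 L2A data, a simple mask can be built from the scene classification (SCL) band, as sketched below. The class codes follow ESA's published SCL scheme; the toy arrays stand in for rasters that would normally be read with rasterio.

```python
# Sketch: mask cloud-affected Sentinel-2 L2A pixels using the scene
# classification (SCL) band. Class codes follow ESA's SCL scheme; the toy
# arrays stand in for rasters that would normally be read with rasterio.
import numpy as np

CLOUDY_SCL = (3, 8, 9, 10)  # shadow, medium/high-probability cloud, cirrus

def mask_clouds(band: np.ndarray, scl: np.ndarray) -> np.ndarray:
    """Return a copy of band with cloud-affected pixels set to NaN."""
    out = band.astype("float32").copy()
    out[np.isin(scl, CLOUDY_SCL)] = np.nan
    return out

band = np.arange(9, dtype="float32").reshape(3, 3)   # toy reflectance band
scl = np.array([[4, 4, 8], [9, 4, 4], [4, 3, 4]])    # toy SCL labels
print(mask_clouds(band, scl))
```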

Publication references

Huylenbroeck, L., Laslier, M., Dufour, S., Georges, B., Lejeune, P. and Michez, A., 2020. Using remote sensing to characterize riparian vegetation: A review of available tools and perspectives for managers. Journal of Environmental Management, 267, p.110652.
https://www.sciencedirect.com/science/article/pii/S0301479720305843?via%3Dihub

Johansen, K. and Phinn, S., 2006. Mapping structural parameters and species composition of riparian vegetation using IKONOS and Landsat ETM+ data in Australian tropical savannas. Photogrammetric Engineering & Remote Sensing, 72(1), pp.71-80.
https://www.ingentaconnect.com/contentone/asprs/pers/2006/00000072/00000001/art00007?crawler=true&mimetype=application/pdf

Doody, T.M., Lewis, M., Benyon, R.G. and Byrne, G., 2014. A method to map riparian exotic vegetation (Salix spp.) area to inform water resource management. Hydrological Processes, 28(11), pp.3809-3823.
https://onlinelibrary.wiley.com/doi/10.1002/hyp.9916

Hartling, S., Sagan, V., Sidike, P., Maimaitijiang, M. and Carron, J., 2019. Urban tree species classification using a WorldView-2/3 and LiDAR data fusion approach and deep learning. Sensors, 19(6), p.1284.

Klemas, V., 2014. Remote sensing of riparian and wetland buffers: an overview. Journal of Coastal Research, 30(5), pp.869-880.

Weinstein, B.G., Marconi, S., Bohlman, S., Zare, A. and White, E., 2019. Individual tree-crown detection in RGB imagery using semi-supervised deep learning neural networks. Remote Sensing, 11(11), p.1309.
https://www.mdpi.com/2072-4292/11/11/1309

Ashqar, B.A., Abu-Nasser, B.S. and Abu-Naser, S.S., 2019. Plant seedlings classification using deep learning. International Journal of Academic Information Systems Research (IJAISR), 3(1), pp.7-14.
http://dstore.alazhar.edu.ps/xmlui/bitstream/handle/123456789/279/ASHPSCv2.pdf?sequence=1&isAllowed=y

Belcore, E., Pittarello, M., Lingua, A.M. and Lonati, M., 2021. Mapping Riparian Habitats of Natura 2000 Network (91E0*, 3240) at Individual Tree Level Using UAV Multi-Temporal and Multi-Spectral Data. Remote Sensing, 13(9), p.1756.
https://www.mdpi.com/2072-4292/13/9/1756