This session will feature academic-based presentations using remotely sensed data to map and monitor natural ecosystems.
2:00 – 2:15 PM – Forestry Using Next Generation Aerial Geiger-Mode Lidar
The NC DWR is currently working with several stakeholders in the Yadkin Pee Dee River Basin to establish new rules on nutrient loads to improve water quality in High Rock Lake, NC. Measuring the impact of management interventions is critical to assess the effectiveness of large financial investments and continuously improve nutrient management strategies. Remote sensing data from satellites and drones can be translated into time-series maps of chlorophyll-a (Chl-a), total suspended solids (TSS), and colored dissolved organic matter (CDOM) that cover the entire lake. These maps can supplement sparse in-situ water quality datasets and enable more holistic and comprehensive management plans that consider spatial and temporal variability, patterns, and trends. In this presentation, we will describe a multifaceted research project that aims to map water quality in High Rock Lake using medium-resolution Sentinel-2 imagery, high-resolution PlanetScope imagery, and UAV-collected imagery. We have focused on mapping Chl-a, the parameter used to regulate nutrient concentrations and considered a measure of algal bloom occurrence risk. We will describe the statistical models used to relate remote sensing reflectance to Chl-a and present the Google Earth Engine applications and the ArcGIS data portal and dashboard that we are using to disseminate results. We regularly meet with state water quality managers to solicit feedback and iteratively develop these public-facing data repositories. This project is funded by the North Carolina Attorney General’s Office Environmental Enhancement Grant Program.
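The abstract does not specify the form of the statistical models relating reflectance to Chl-a. As a hedged sketch only (the choice of a red-edge/red band ratio and all coefficients are assumptions, not the project's actual model), a common log-linear band-ratio regression against in-situ Chl-a matchups might look like:

```python
import numpy as np

def fit_band_ratio_model(ratio, chl_insitu):
    """Fit ln(Chl-a) = a + b * ln(ratio) to matchup data by least squares.

    `ratio` is a hypothetical red-edge/red reflectance ratio sampled at
    the in-situ stations; both inputs are 1-D arrays of equal length.
    """
    X = np.column_stack([np.ones_like(ratio), np.log(ratio)])
    coeffs, *_ = np.linalg.lstsq(X, np.log(chl_insitu), rcond=None)
    return coeffs  # (a, b)

def predict_chl(ratio, coeffs):
    """Apply the fitted model to a whole scene's band-ratio image."""
    a, b = coeffs
    return np.exp(a + b * np.log(ratio))
```

Once fitted on matchups, `predict_chl` can be mapped over every lake pixel to produce the time-series Chl-a maps the abstract describes.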
Leila Hashemi-Beni, NC A&T University
2:15 – 2:30 PM – Identifying Post-Disaster Damages in Anna Maria Island, Florida, with Aerial Images and Volunteers
Hurricane Milton was a Category 3 storm that made landfall in Florida on 10 October 2024. Its 120-mph winds left a trail of damage across several areas of the state. The National Oceanic and Atmospheric Administration (NOAA) collected thousands of high-resolution aerial images of the impacted areas. These post-disaster images contained the information necessary for assessing damage to buildings, infrastructure, and other property. The School of Computing at the University of Wyoming organized mapping sessions in which volunteers identified damage from the NOAA aerial images. Damage assessment maps generated from the volunteered information were uploaded to the International Charter on Space and Major Disasters’ website for distribution to emergency management agencies. This presentation will provide an overview of organizing these mapping sessions, training volunteers to identify different types of damage, and post-processing the interpreted data for Anna Maria Island prior to generating damage maps.
Devon Borthwick, University of Wyoming
2:30 – 2:45 PM – Hyperspectral Based UAS for Coastal Vegetation Monitoring and Management
Vegetation classification is fundamental to exploring and assessing distinct plant species. Preserving native species and removing invasive species is crucial for a sustainable ecosystem. The main objective of this research is to analyze the spatial distribution of native and special-status species and conserve them from extinction. The research also focuses on identifying the spatial extent of invasive species in order to develop targeted plans for their removal and to inhibit further spread. The case study site is the Jupiter Inlet Lighthouse Outstanding Natural Area (ONA) in northern Palm Beach County, which covers 120 acres of coastal landscape owned and managed by the Bureau of Land Management (BLM). The ONA is an urbanized coastal area that serves as habitat for various native and special-status species. BLM has identified and reported nine special-status plant species at the site that are on the verge of extinction. In addition, the Florida Exotic Pest Plant Council has identified thirty-eight invasive plants in the ONA that can disrupt the native and special-status species. Because of the limited spatial extent of these plants, identification is traditionally done manually, which is laborious and time-consuming. While remote sensing methods provide an obvious alternative to manual methods, acquiring data with both high spatial and high spectral resolution remains a challenge. This research will demonstrate a method of using a UAS-based hyperspectral camera to overcome those limitations. Multispectral cameras, with their small number of bands, limit the separability of vegetation species; to overcome this limitation, a hyperspectral sensor is deployed. The hyperspectral mapping system is built around a Resonon sensor, which records reflectance of electromagnetic radiation in 447 bands with a bandwidth of 1.9 nm over the 400–1000 nm wavelength range.
This research will discuss and demonstrate the advantages of a larger number of spectral bands over a broader wavelength range for classifying coastal plants, comparing the results against multispectral data. The data collected with the hyperspectral UAS were processed for classification using both ground-truth data and spectral signatures. The presentation will cover the object-based and pixel-based classification results for recently collected hyperspectral data of the Jupiter Inlet Lighthouse ONA using traditional rule-based models, machine learning models, and deep neural networks, along with their respective accuracies. The challenges of processing hyperspectral data due to its volume, the application of dimensionality reduction, and the reliability of the reduced data for classification are also discussed. The research outcomes support the feasibility of a hyperspectral UAS for mapping and monitoring native, special-status, and invasive species in coastal natural areas, as shown by comparison with the classification results obtained from a multispectral sensor.
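The abstract mentions dimensionality reduction over the 447-band cube but does not name the method; principal component analysis is a common choice for hyperspectral data and is assumed here purely for illustration. A minimal NumPy-only sketch of PCA along the spectral dimension:

```python
import numpy as np

def pca_reduce(cube, n_components=10):
    """Reduce a (rows, cols, bands) hyperspectral cube to n_components
    principal-component bands, retaining most of the spectral variance."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    X -= X.mean(axis=0)                        # centre each band
    cov = (X.T @ X) / (X.shape[0] - 1)         # bands x bands covariance
    eigvals, eigvecs = np.linalg.eigh(cov)     # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]
    scores = X @ eigvecs[:, order]             # project onto top components
    explained = eigvals[order].sum() / eigvals.sum()
    return scores.reshape(rows, cols, n_components), explained
```

The reduced cube can then feed the rule-based, machine-learning, or deep-network classifiers at a fraction of the original data volume; `explained` reports the fraction of spectral variance the retained components preserve.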
Nithish Manikkavasagam, Florida Atlantic University
3:00 – 3:15 PM – Airborne Lidar Point Clouds Classifications By Bayesian Approach With Prior Datasets And Satellite Images
Accurately identifying coastal plants is required for estimating the effects of relative sea-level rise. Although lidar data providers currently label buildings and bare earth in upland areas, there is extensive misclassification in coastal wetlands. For instance, bare soil is widely confused with marshes, and many mangrove points are labeled as ground. The main objective is to re-classify the lidar point categories produced by data suppliers using our Bayesian approach together with WorldView-2 and Sentinel-2 multispectral imagery, the USGS 30 m DEM, and the NOAA C-CAP 30 m landcover dataset. The working site of our NOAA ESLR project is covered by 4 Sentinel-2 images, 325 available WorldView-2 images acquired from 2010 to 2024, and 6,167 lidar tiles of 1500 m x 1500 m. Deep neural networks usually require a substantial training dataset drawn from all usable data, and it is not operationally feasible for us to produce a robust training set from hundreds of WorldView-2 images and thousands of lidar tiles and then reclassify the point clouds within limited time and resources. We therefore developed a Bayesian approach consisting of three phases. Phase one generates four major landcover categories (water, vegetation, bare earth, and other) by confidence thresholding multi-temporal vegetation indices, namely the Modified Soil-Adjusted Vegetation Index (MSAVI) and the Green Chlorophyll Index (GCI), computed from the WorldView-2 and Sentinel-2 imagery. Here we developed Bayesian spatial filters that refine ambiguous pixels by addressing spectral and spatial mixing; for instance, they can separate spectrally similar objects such as asphalt-paved surfaces and water, and spatially mixed objects such as breakwaters and water. Phase two identifies bare-earth areas and water by refining the suppliers' classification with an entropy filter and our water classification algorithms, respectively.
We then correct mislabeled ground points within wetland limits using the USGS 30 m DEM, applying morphological filters and statistics over connected pixels. This yields reliable water, ground, and non-ground classifications for both upland and wetland. Finally, phase three combines the two preceding results to generate reliable categories of grass, shrub, and trees for upland, and high-confidence categories of marsh and mangrove for wetland, using Bayesian spatial filters with prior knowledge from the NOAA C-CAP landcover dataset, in addition to the water, ground, and built-up classes. This classification approach shows promising results in experiments at Oso Bay, Corpus Christi, Texas. We will next apply it to the full working site of our NOAA ESLR project, the six-county Coastal Bend of Texas.
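The two vegetation indices used in phase one have standard closed forms. As an illustrative sketch (the threshold values below are placeholders, not the study's, and the coarse labeling rule is a simplification of the Bayesian thresholding-with-confidence the abstract describes):

```python
import numpy as np

def msavi(nir, red):
    """Modified Soil-Adjusted Vegetation Index (MSAVI2 closed form)."""
    return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

def gci(nir, green):
    """Green Chlorophyll Index: NIR / green reflectance - 1."""
    return nir / green - 1

def coarse_classes(nir, red, green, veg_thresh=0.3, water_thresh=0.05):
    """Coarse phase-one labels from index thresholds (placeholder values):
    vegetation where MSAVI is high, water where NIR reflectance is very
    low, everything else left as 'other'."""
    labels = np.full(np.shape(nir), "other", dtype=object)
    labels[msavi(nir, red) > veg_thresh] = "vegetation"
    labels[np.asarray(nir) < water_thresh] = "water"
    return labels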
Lihong Su, Texas A&M University-Corpus Christi
3:15 – 3:30 PM – Advancing Canopy Height Mapping through Precision3D and GEDI Data Fusion
This presentation will explore the innovative fusion of very high-resolution photogrammetric data from Maxar stereo-optical imagery and lidar data from NASA’s Global Ecosystem Dynamics Investigation (GEDI). By integrating these complementary datasets, we aim to enhance the accuracy and detail of forest canopy height models. Initial results demonstrate a 25% improvement in the accuracy of the photogrammetric height model after fusion with GEDI, particularly in complex and heterogeneous forest environments. This fusion leverages the strengths of both continuous high-resolution optical data and discrete lidar measurements, offering a robust approach to forest monitoring and management.
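The abstract does not detail how the GEDI and photogrammetric datasets are fused. One common approach, assumed here purely as an illustration (the linear form and the use of the RH98 metric are assumptions, not Maxar's stated method), is a bias correction of the photogrammetric canopy height model fitted against GEDI relative-height metrics at coincident footprint locations:

```python
import numpy as np

def fit_gedi_correction(photo_heights, gedi_rh98):
    """Least-squares fit of a linear correction mapping photogrammetric
    canopy heights to GEDI RH98 at coincident footprint locations."""
    A = np.column_stack([np.ones_like(photo_heights), photo_heights])
    (b0, b1), *_ = np.linalg.lstsq(A, gedi_rh98, rcond=None)
    return b0, b1

def correct_chm(chm, b0, b1):
    """Apply the fitted correction to the full canopy height model."""
    return b0 + b1 * chm
```

Because GEDI footprints are sparse but well calibrated, fitting at footprints and applying the correction wall-to-wall is one way the discrete lidar measurements can lift the accuracy of the continuous optical height model.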
Levi Madenberg, Maxar Technologies