Black Swift Technologies recently announced the expansion of a pathfinder mission with NASA’s Goddard Space Flight Center to develop enhanced multi-angular remote sensing techniques using small Unmanned Aircraft Systems (sUAS). These autonomous platforms monitor crop health and growth more effectively and efficiently by using a narrow spectral band (centered at 531 nm) to derive vegetation photosynthesis-related indices (e.g., the chlorophyll/carotenoid index, CCI, and the photochemical reflectance index, PRI), tracking seasonally changing pigment ratios and photosynthetic rates that established greenness indices (e.g., NDVI) cannot capture.
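For reference, the two 531 nm-based indices mentioned above are commonly computed as simple normalized band ratios. The sketch below shows typical formulations from the remote sensing literature; the 570 nm reference band for PRI and the red band for CCI are assumptions drawn from common usage, not from MALIBU’s documentation.

```python
# Hedged sketch: common formulations of the photosynthesis-related indices
# mentioned above, computed from per-band surface reflectance. Band choices
# (531, 570, and a red band) follow typical PRI/CCI definitions in the
# literature and are not taken from MALIBU's specifications.

def pri(r531: float, r570: float) -> float:
    """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570)."""
    return (r531 - r570) / (r531 + r570)

def cci(r531: float, r_red: float) -> float:
    """Chlorophyll/Carotenoid Index, one common form: (R531 - Rred) / (R531 + Rred)."""
    return (r531 - r_red) / (r531 + r_red)

# Illustrative reflectance values only (not measured data):
print(pri(0.042, 0.048))   # shifts in this ratio track pigment changes
print(cci(0.042, 0.055))
```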
The pathfinder mission, called MALIBU (Multi AngLe Imaging Bidirectional reflectance distribution function small-UAS), uses Black Swift’s most advanced small Unmanned Aircraft System, the Black Swift S2™ (Figure 1), to capture multi-angle reflectance measurements for land surface studies using multispectral imagers oriented at different viewing angles. MALIBU’s primary subsystem, a multi-angular sensor array based on the Tetracam Mini-Multiple Camera Array (MCA) imaging system, generates science-quality reference data sets suitable for calibration/validation activities supporting NASA’s flagship Earth Science missions.
For nearly 30 years, NASA has been using satellite remote sensors to measure and map the density of vegetation over the Earth. Using NOAA’s Advanced Very High Resolution Radiometer (AVHRR), scientists have been collecting images of our planet’s surface that show the density of plant growth over the entire globe. The most common measurement is the Normalized Difference Vegetation Index (NDVI), calculated from the visible and near-infrared light reflected by vegetation. Healthy vegetation absorbs most of the visible light that hits it and reflects a large portion of the near-infrared light; unhealthy or sparse vegetation reflects more visible light and less near-infrared light. NDRE (Normalized Difference Red Edge) is an analogous index, computed when a sensor includes a red-edge band, that allows chlorophyll content to be estimated. While NDVI and NDRE provide valuable data, advances in multispectral sensors correct for some of the distortions in reflected light caused by airborne particles and by ground cover beneath the vegetation, yielding more accurate imagery and data sets.
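As a concrete illustration of the indices just described, here is a minimal sketch of the standard NDVI and NDRE band-ratio formulas applied per pixel to reflectance arrays; the band names and sample values are illustrative, not MALIBU data.

```python
# Hedged sketch of the standard NDVI and NDRE formulas described above,
# applied per pixel to NumPy reflectance arrays. Values are illustrative.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); approaches +1 for dense, healthy vegetation."""
    return (nir - red) / (nir + red)

def ndre(nir: np.ndarray, red_edge: np.ndarray) -> np.ndarray:
    """NDRE = (NIR - RedEdge) / (NIR + RedEdge); sensitive to chlorophyll content."""
    return (nir - red_edge) / (nir + red_edge)

# Healthy vegetation reflects strongly in the near-infrared:
print(ndvi(np.array([0.45]), np.array([0.05])))   # ~0.80
print(ndre(np.array([0.45]), np.array([0.20])))   # ~0.38
```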
While typical multispectral cameras might have five channels, MALIBU’s multi-angular sensor array comprises 12 sensors, or channels. The primary (port-side) Tetracam camera has five channels plus an incident light sensor, while the secondary (starboard-side) camera has six channels. When combined, the cameras operate as a single sensor suite with a field of view of 117.2 degrees. The MALIBU channels were specifically chosen to cover the relative spectral response (RSR) of multiple satellite land sensors, such as the Landsat-8 OLI, the Sentinel-2 MSI, both Terra and Aqua MODIS, Terra MISR, Suomi-NPP/JPSS VIIRS, and Sentinel-3 OLCI. By deploying MALIBU several times over a single day, data can be obtained at multiple solar angles as well as multiple observation angles, significantly improving the accuracy of BRDF retrievals.1
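To make the benefit of sampling many solar and observation angles concrete, the sketch below fits a generic linear kernel-driven BRDF model (the form used operationally for the MODIS BRDF/albedo product) to multi-angle reflectance samples by least squares. The kernel values and numbers are synthetic, and this is not presented as MALIBU’s actual processing chain.

```python
# Hedged sketch: how reflectance samples from many sun/view geometries constrain
# a linear kernel-driven BRDF model of the form R = f_iso + f_vol*K_vol + f_geo*K_geo.
# The kernel values are assumed to be precomputed for each observation geometry.
import numpy as np

def fit_brdf(reflectance, k_vol, k_geo):
    """Least-squares fit of the kernel weights (f_iso, f_vol, f_geo).

    reflectance, k_vol, k_geo: 1-D arrays, one entry per observation geometry.
    """
    A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
    params, *_ = np.linalg.lstsq(A, reflectance, rcond=None)
    return params

# Synthetic example: more angular samples give a better-conditioned fit.
rng = np.random.default_rng(0)
k_vol = rng.uniform(-0.3, 0.6, 40)      # hypothetical volumetric kernel values
k_geo = rng.uniform(-1.5, 0.0, 40)      # hypothetical geometric kernel values
truth = np.array([0.25, 0.08, 0.03])    # f_iso, f_vol, f_geo
refl = truth[0] + truth[1] * k_vol + truth[2] * k_geo
print(fit_brdf(refl, k_vol, k_geo))     # recovers ~[0.25, 0.08, 0.03]
```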
According to NASA, “By enabling increased understanding of surface directional reflectance variability at sub-pixel resolution, MALIBU will allow NASA missions to assess and improve the retrieval of reflectance-based biogeophysical properties.” These properties include vegetation indices, land cover, phenology, surface albedo, snow/ice cover, Leaf Area Index (LAI)/Fraction of Absorbed Photosynthetically Active Radiation (fAPAR), and other terrestrial Essential Climate Variables (ECVs). According to the U.S. Geological Survey, the World Meteorological Organization (WMO), and its Global Climate Observing System (GCOS), terrestrial ECVs generated from satellites provide the empirical evidence needed to understand and predict the evolution of climate, to guide mitigation and adaptation measures, and to assess risks and enable attribution of climate events to underlying causes.2
MALIBU will also provide users with highly accurate imagery supporting crop status assessment, plant vigor and growth monitoring, and pest and disease detection: essential knowledge and data for evaluating crop development, maximizing yield, and improving production efficiency.
“The goal of MALIBU is to capture timely and accurate in-situ data at a fraction of the cost of traditional NASA airborne science platforms,” says Jack Elston, Ph.D., CEO of Black Swift Technologies. “By measuring land biophysical parameters from a cost-effective, repeatable sUAS platform, MALIBU complements NASA’s satellite observations while significantly reducing the logistical and technical complexities of manned aircraft operations in remote geographical regions.”
MALIBU relies heavily on Black Swift Technologies’ proprietary SwiftCore™ Flight Management System to achieve the accuracy the mission requires. SwiftCore’s high-performance autopilot allows the science team to fly MALIBU referenced either to height above ground level (AGL, with altitude varying to follow terrain) or to mean sea level (MSL, at a near-constant absolute height), enabling multi-angle reflectance measurements for land surface process studies using sUAS. In initial flight tests, MALIBU captured high angular sampling of surface reflectance anisotropy (defined by the Bidirectional Reflectance Distribution Function, another terrestrial ECV) at 10 cm spatial resolution. Sampling of both diurnal and seasonal landscape patterns was achieved under clear-sky conditions (often difficult at high latitudes). Additionally, the quick turnaround between flight deployments (ready for the next flight in less than one hour) allowed the science team to conduct measurements during a Landsat-8 OLI overpass.
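For context on a figure like 10 cm spatial resolution: for a frame camera, ground sample distance scales linearly with height above ground, which is why the AGL/MSL flight modes matter. The sketch below shows that arithmetic with hypothetical camera parameters, not Tetracam Mini-MCA specifications.

```python
# Hedged sketch of the ground-sample-distance (GSD) arithmetic behind a figure
# like "10 cm spatial resolution." All camera parameters here are hypothetical
# placeholders, not Tetracam Mini-MCA specifications.

def gsd_m(altitude_agl_m: float, pixel_pitch_m: float, focal_length_m: float) -> float:
    """GSD = height above ground * detector pixel pitch / lens focal length."""
    return altitude_agl_m * pixel_pitch_m / focal_length_m

# e.g., a 5.2-micron pixel behind a 9.6 mm lens flown 185 m above ground:
print(gsd_m(185.0, 5.2e-6, 9.6e-3))   # ~0.10 m per pixel
```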
The success of MALIBU’s first field campaign demonstrates to scientists and land-use planners worldwide the ability to monitor agriculture and to plan for climate and weather changes and their effects more accurately and effectively than ever before.