Last night's SQF team swap used our live aerial IR. A step-stare camera flying a NIROPS mission at 17,500 ft transmitted live nadir high-resolution LWIR, RGB, and NIR to the bottom of a big war-room screen, with AI fire polygons overlaid as soon as they were processed. The top of the screen showed a second thermal camera, an IR/EO gimbal on a manned aircraft, which I operated from my seat in the meeting over our low-latency MANET mesh. We've streamed live IR over 140 miles line-of-sight. Field crews, remote workers, and other offices watched the IR streamed over the internet (1-2 seconds from plane to hand) on iPhones and Androids, with real-time heat polygons on a moving map.
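For anyone curious how "heat polygons on a moving map" can come out of a raw thermal frame, here is a minimal sketch of one common approach: threshold a georeferenced LWIR frame and vectorize the hot regions into GeoJSON. The filename, the temperature threshold, and the use of rasterio are my assumptions for illustration, not a description of the actual onboard processing.

```python
import json
import rasterio
from rasterio import features

HOTSPOT_KELVIN = 500.0  # assumed flame-front threshold; the real value depends on sensor calibration

with rasterio.open("lwir_frame.tif") as src:         # hypothetical georeferenced radiometric frame
    temps = src.read(1).astype("float32")            # brightness temperature in Kelvin
    hot = (temps >= HOTSPOT_KELVIN).astype("uint8")  # 1 where the pixel reads as actively burning

    # Vectorize connected hot regions into polygons in the frame's CRS.
    polygons = [
        {"type": "Feature", "geometry": geom, "properties": {"class": "heat"}}
        for geom, value in features.shapes(hot, mask=hot.astype(bool), transform=src.transform)
        if value == 1
    ]

# A GeoJSON FeatureCollection is something most moving-map clients can overlay directly.
with open("fire_polygons.geojson", "w") as fp:
    json.dump({"type": "FeatureCollection", "features": polygons}, fp)
```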
The transitioning teams used live visuals of the fire in Kern Canyon throughout the briefing, then crowded around to evaluate, strategize, and plan for where the Rattlesnake and Castle fires could grow together. Multi-gigapixel RGB, NIR, and LWIR ortho-mosaics were post-processed for damage assessment, vegetation burn analysis, and dozer-track mapping, and were used by Sequoia Nat'l Park to evaluate the fire's effect on Giant Sequoias.
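As a rough sketch of one way the vegetation burn analysis could be framed (not the actual post-processing workflow): difference pre- and post-fire NDVI from co-registered RGB+NIR ortho-mosaics. The file names, band order, and severity threshold below are assumptions for illustration.

```python
import numpy as np
import rasterio

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index, guarded against divide-by-zero."""
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / denom)

def read_band(path: str, band: int) -> np.ndarray:
    with rasterio.open(path) as src:
        return src.read(band).astype("float32")

# Hypothetical co-registered mosaics; band 1 = red, band 4 = NIR.
pre_ndvi  = ndvi(read_band("pre_fire_mosaic.tif", 1),  read_band("pre_fire_mosaic.tif", 4))
post_ndvi = ndvi(read_band("post_fire_mosaic.tif", 1), read_band("post_fire_mosaic.tif", 4))

# A drop in NDVI is a common proxy for vegetation loss / burn severity.
dndvi = pre_ndvi - post_ndvi
print(f"fraction of pixels with severe vegetation loss: {(dndvi > 0.3).mean():.2%}")
```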
We have more day-and-night, high-resolution, multi-spectral data of active fires this year than ever before. I hope it can be analyzed to better fight these gigafires and to help the burned environments recover.