In environmental research, the most compelling stories often emerge at the intersection of technical ingenuity and demanding fieldwork. A recent contribution by Antonio Castañeda-Gómez offers precisely that—an elegant yet powerful visualisation of fire spread, developed from complex UAS data and careful analytical design.
Working with RGB and thermal imagery collected from multiple unmanned aerial systems, Antonio produced a dynamic video that reveals how fire propagates across savanna landscapes. The result is visually striking, but also scientifically grounded—combining temporal precision with spatial clarity in a way that invites both exploration and interpretation.
The data behind this work did not come easily. Our field campaigns in African savannas required coordinated flights of three UAS operating simultaneously, each equipped with thermal sensors to capture subtle variations in fire behaviour. These efforts were carried out by Antonio alongside MSc student Anna Bischof, PhD researcher Luisa Pflumm, and Dr. Mirjana Bevanda.
Collecting consistent, high-quality data under such conditions is already a significant achievement. Yet, as is often the case, the real challenge began afterwards.
Transforming raw imagery into a coherent, informative animation required a different level of expertise. Antonio developed the workflow primarily in R, drawing on a suite of packages including magick, ggplot2, exiftoolr, purrr, lubridate, and dplyr.
The workflow transforms raw drone imagery and flight logs into fully synchronised, telemetry-rich visual outputs ready for video production. The script automatically extracts metadata from each image, aligns it with the closest flight log entry in time, and merges everything into a single structured dataset. From there, it generates clean, cinematic frames that display key flight information, such as altitude, speed, GPS position, gimbal orientation, battery status, and heading, directly on top of the imagery. The result is a streamlined process that turns standard UAS data into professional, information-enhanced visuals suitable for analysis, presentations, or storytelling.
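The metadata extraction and time-alignment steps described above can be sketched in R roughly as follows. This is a minimal illustration, not the project's actual script: the file paths, the `flight_log.csv` name, and the column names (`DateTimeOriginal`, `timestamp`, and the telemetry fields) are assumptions that would vary with the camera and UAS vendor.

```r
library(exiftoolr)   # R wrapper around ExifTool for image metadata
library(dplyr)
library(lubridate)

# Read EXIF metadata from every frame. 'DateTimeOriginal' is a
# typical ExifTool tag; the exact tags depend on the sensor.
imgs <- exif_read(list.files("frames", full.names = TRUE)) |>
  transmute(file = SourceFile,
            img_time = ymd_hms(DateTimeOriginal))

# Flight log with a timestamp column plus telemetry fields
# (altitude, speed, heading, ...); hypothetical column names.
log <- read.csv("flight_log.csv") |>
  mutate(log_time = ymd_hms(timestamp))

# Match each image to the closest preceding log entry in time,
# merging imagery and telemetry into one structured dataset.
frames <- left_join(imgs, log,
                    by = join_by(closest(img_time >= log_time)))
```

The `join_by(closest(...))` helper (dplyr ≥ 1.1) handles the "nearest flight log entry" matching in a single declarative step, avoiding a manual loop over timestamps.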
Through this pipeline, individual frames were not only stitched together but enriched with flight specifications—integrated dynamically to provide context without overwhelming the viewer. The outcome is not just a visualisation, but a layered narrative of the fire event itself.
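In the same spirit, overlaying telemetry on each frame and assembling the result into a video could look roughly like the sketch below, using the magick and purrr packages mentioned earlier. The `frames` data set, the column names, and the overlay layout are illustrative assumptions, not the published workflow.

```r
library(magick)
library(purrr)

# 'frames' is assumed to be a data frame with one row per image:
# a 'file' path plus telemetry columns (altitude, speed, heading).
annotated <- pmap(frames, function(file, altitude, speed, heading, ...) {
  image_read(file) |>
    image_annotate(
      sprintf("ALT %.0f m  SPD %.1f m/s  HDG %03.0f", altitude, speed, heading),
      size = 28, color = "white", boxcolor = "#00000080",
      gravity = "southwest", location = "+20+20")
})

# Stitch the annotated frames into a video
# (image_write_video() requires the 'av' package).
image_write_video(image_join(annotated), "fire_spread.mp4", framerate = 10)
```

Keeping the overlay to a single semi-transparent text box in one corner is one way to provide flight context without overwhelming the imagery itself.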
This work forms part of a broader research initiative led by Tobias Ullmann, Mirjana Bevanda, and Luisa Pflumm, funded by the Deutsche Forschungsgemeinschaft. The project focuses on understanding the interplay between fire dynamics and water stress in African savannas—an area of growing importance under changing climatic conditions.