AI Summary • Published on Dec 2, 2025
Flapping-wing robots, particularly tailless ornithopters like the Flapper Nimble+, generate aggressive vibrations (12-20 Hz) from their wing actuation. This severe shaking makes camera-based perception extremely difficult and poses a major obstacle to autonomous flight and real-world applications for these agile robots. Existing mechanical stabilization systems add undesirable weight and complexity. Software-based methods often rely on feature matching, which can fail under the motion blur caused by rapid movements, or on inaccurate depth assumptions (such as planar-scene homographies) that introduce visual distortions. Furthermore, the onboard IMUs on these platforms have been found to be insufficiently accurate for effective stabilization during flight.
Inspired by animal microsaccades, the "Artificial Microsaccade Compensation" method provides real-time video stabilization for ornithopters. Rather than matching features or assuming scene depth, it directly minimizes the change in image intensity between frames by optimizing 3D rotations on the SO(3) group with an efficient, equivariant inverse compositional Lucas-Kanade method. The algorithm first estimates the rotational delta between each incoming frame and a fixed template. It then computes a stable viewpoint by applying a low-pass filter on SO(3) to the accumulated orientation estimate. Finally, it generates each stabilized frame by averaging the last six frames aligned to the current stable view, which helps mitigate rolling-shutter effects and transmission artifacts. A "saccade" mode is also introduced: it holds a fixed viewing orientation for short periods and uses an efficient recursive update to further reduce inter-frame motion.
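The smoothing-and-rewarping stage can be illustrated with a short sketch. The rotation-estimation step (the equivariant inverse compositional Lucas-Kanade solver) is not reproduced here; the code below assumes per-frame camera orientations are already available and shows how a first-order low-pass filter on SO(3) and a purely rotational image warp could produce the stabilized output, with the last six aligned frames averaged as described above. Function names such as `smooth_orientation` and `warp_to_stable_view`, and the smoothing gain `alpha`, are illustrative assumptions, not identifiers from the released implementation.

```python
# Minimal sketch of the smoothing-and-rewarping stage, assuming per-frame
# camera orientations (camera-to-world rotations) have already been estimated,
# e.g. by the paper's equivariant inverse compositional Lucas-Kanade step,
# which is not reproduced here. Names like `smooth_orientation`,
# `warp_to_stable_view`, and the gain `alpha` are illustrative.
import numpy as np
import cv2
from scipy.spatial.transform import Rotation as R

def smooth_orientation(R_stable: R, R_current: R, alpha: float = 0.05) -> R:
    """First-order low-pass filter on SO(3): move the stable viewpoint a small
    fraction of the way toward the current estimated camera orientation."""
    delta = R_stable.inv() * R_current              # relative rotation
    return R_stable * R.from_rotvec(alpha * delta.as_rotvec())

def warp_to_stable_view(frame, R_current: R, R_stable: R, K: np.ndarray):
    """Re-render the frame from the stabilized viewpoint. For a pure rotation
    the induced image warp is the homography H = K * R_s^T * R_c * K^-1,
    so no depth assumption is needed."""
    H = K @ (R_stable.inv() * R_current).as_matrix() @ np.linalg.inv(K)
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))

def stabilize(frames, orientations, K, n_fuse=6):
    """Align the last `n_fuse` frames to the current stable view and average
    them, mirroring the six-frame fusion described above (lists expected)."""
    R_stable = orientations[0]
    recent = []                                     # (frame, orientation) pairs
    for frame, R_k in zip(frames, orientations):
        R_stable = smooth_orientation(R_stable, R_k)
        recent = (recent + [(frame, R_k)])[-n_fuse:]
        aligned = [warp_to_stable_view(f, R_f, R_stable, K) for f, R_f in recent]
        yield np.mean(aligned, axis=0).astype(frame.dtype)
```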
The method produced the first stable, smooth videos from a tailless ornithopter, the Flapper Nimble+. In comparative evaluations against Adobe Premiere Pro's Warp Stabilizer, Artificial Microsaccade Compensation achieved higher-quality results and ran in real time, whereas the commercial software often introduced distortions or failed to stabilize effectively. The algorithm ran at 67-77 frames per second (fps) on a laptop CPU at 240x152 resolution. Quantitatively, it reduced the RMS difference between consecutive output images by roughly 50%, with the saccade variant achieving an even larger 7x reduction. RMS normal-flow magnitude dropped from approximately 1.1 pixels/frame in unstabilized videos to 0.5-0.9 pixels/frame for stabilized footage and 0.1-0.23 pixels/frame for saccade-stabilized footage. The method maintained a high percentage of valid pixels (over 99% for standard stabilization and above 97% for saccade stabilization) and substantially reduced the average angular velocity of the stabilized view (30-130 deg/s) compared to the raw IMU estimates (>250 deg/s).
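As a rough illustration of how two of these metrics could be computed, the snippet below sketches the RMS intensity difference between consecutive output frames and the fraction of valid (image-covered) pixels after warping. It is a plausible reading of the reported quantities, not the paper's evaluation code, and the function names are assumptions.

```python
# Hedged sketch of two of the reported metrics: RMS intensity difference
# between consecutive output frames and the fraction of valid pixels after
# the rotational warp. Exact definitions used in the paper may differ.
import numpy as np

def rms_frame_difference(prev: np.ndarray, curr: np.ndarray) -> float:
    """Root-mean-square intensity change between consecutive frames;
    lower values indicate less residual inter-frame motion."""
    diff = curr.astype(np.float32) - prev.astype(np.float32)
    return float(np.sqrt(np.mean(diff ** 2)))

def valid_pixel_ratio(stabilized: np.ndarray) -> float:
    """Fraction of output pixels still covered by image content, assuming
    unfilled regions left by the warp are zero-valued."""
    mask = stabilized.max(axis=-1) if stabilized.ndim == 3 else stabilized
    return float(np.count_nonzero(mask)) / mask.size
```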
The real-time, distortion-free video stabilization achieved by Artificial Microsaccade Compensation represents a significant advancement for the practical application of agile, flapping-wing robots. By providing stable visual input, it facilitates the development of onboard camera-based perception for autonomous flight and other robotic tasks that were previously hindered by aggressive vibrations. The open-sourced implementation of this method will benefit other researchers and practitioners facing similar challenges with shaking robotic platforms. Future research directions include exploring alternative frame fusion techniques to preserve image sharpness, integrating automatic camera intrinsic calibration, further investigating the utility of IMUs for rough compensation, and extending the application of this stabilization approach to other dynamic robots such as robotic dogs and humanoids.