In a significant advance for the safety and efficiency of infrastructure monitoring, researchers at the University of Klagenfurt are testing a pioneering autonomous drone system that could change the way we inspect power lines, bridges and energy facilities. Developed by the university’s Control of Networked Systems research group, the drone navigates complex environments in real time using artificial intelligence and high-performance industrial imaging.
Traditionally, inspecting critical infrastructure has required human personnel to enter hazardous or difficult-to-reach areas, making the process both costly and dangerous. Autonomous drones offer a safer and more efficient alternative, but achieving the precision needed for high-quality data collection has remained a major challenge. This project aims to overcome that limitation.

Eye in the Sky, Brain on Board
Funded by Austria’s Federal Ministry for Climate Action, Environment, Energy, Mobility, Innovation and Technology, the Klagenfurt project equips drones with object-relative navigation capabilities. Unlike systems that rely solely on GPS, this AI-driven model identifies and interprets its surroundings by extracting semantic information from camera images. This allows it to recognise components such as power poles and insulators and orient itself relative to them.
“Precise localisation is crucial so we can compare inspection images across multiple flights,” says Thomas Georg Jantos, a PhD researcher involved in the project. “Our drone does not just see pixels; it interprets them as parts of specific objects. This is essential for conducting reliable inspections.”
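The group’s actual pipeline is not published here, but the core idea of object-relative localisation can be sketched: a detector supplies the image positions of semantic keypoints on a known object, and a Perspective-n-Point solve recovers the camera’s pose relative to that object. In the minimal Python sketch below, the keypoint coordinates, object geometry and camera intrinsics are all illustrative assumptions, not values from the project.

```python
import numpy as np
import cv2

# Hypothetical: 3D positions of four insulator keypoints in the object's
# own frame (metres). A real system would take these from a CAD model
# or survey data.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [0.3, 0.0, 0.0],
    [0.3, 0.6, 0.0],
    [0.0, 0.6, 0.0],
], dtype=np.float64)

# Hypothetical: the same keypoints as located in the image by a semantic
# keypoint detector (pixel coordinates).
image_points = np.array([
    [1020.0, 540.0],
    [1180.0, 545.0],
    [1175.0, 860.0],
    [1015.0, 855.0],
], dtype=np.float64)

# Intrinsics from camera calibration (placeholder values).
camera_matrix = np.array([
    [1400.0, 0.0, 1032.0],
    [0.0, 1400.0, 772.0],
    [0.0, 0.0, 1.0],
])
dist_coeffs = np.zeros(5)  # assume distortion already corrected

# Perspective-n-Point: the camera pose relative to the object, which is
# exactly the "object-relative" quantity the drone can navigate on.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    print("object-relative translation (m):", tvec.ravel())
```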
To capture the necessary data, the drone uses a compact and powerful USB3 Vision camera from IDS Imaging Development Systems. The uEye LE camera, featuring a Sony Pregius IMX265 global shutter sensor, delivers high-resolution 3.19-megapixel images at up to 58 frames per second, which is key to enabling real-time AI processing.
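A quick back-of-the-envelope check shows why such a stream is practical onboard: assuming the IMX265’s nominal 2064 × 1544 resolution and 8-bit monochrome pixels (both assumptions for this estimate), the full 58 fps stream stays well within USB3 bandwidth.

```python
# Rough data-rate estimate for the uEye LE stream (assumed values).
width, height = 2064, 1544   # nominal IMX265 resolution (~3.19 MP)
fps = 58                     # maximum frame rate quoted above
bytes_per_pixel = 1          # 8-bit monochrome; colour formats cost more

rate = width * height * bytes_per_pixel * fps
print(f"{rate / 1e6:.0f} MB/s")  # ~185 MB/s, inside USB3's ~400 MB/s usable
```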

Compact Hardware with Complex Intelligence
The drone operates using a TWINs Science Copter platform, controlled by a Pixhawk PX4 autopilot and powered by an NVIDIA Jetson AGX Orin 64GB onboard computer. This configuration allows the drone to detect key infrastructure elements, maintain its position and adapt mid-flight, all without human intervention.
One of the major technical challenges is fitting the AI system into the drone’s limited onboard computing capacity.
“The processors onboard the drone are still slower than those used during AI model training,” Jantos explains. “Optimising these models for real-time use is still a work in progress.”
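The article does not say which optimisation route the team takes; a common pattern for Jetson-class hardware is to export the trained network to ONNX and build a reduced-precision TensorRT engine on the device. The model and file names below are placeholders, not the project’s actual network.

```python
import torch
import torchvision

# Placeholder network standing in for the (unpublished) inspection model.
model = torchvision.models.resnet18(weights=None).eval()

# Export a fixed-shape ONNX graph that TensorRT can consume.
dummy = torch.randn(1, 3, 512, 512)
torch.onnx.export(model, dummy, "inspection_model.onnx", opset_version=17)

# On the Jetson itself, an FP16 engine could then be built with, e.g.:
#   trtexec --onnx=inspection_model.onnx --fp16 --saveEngine=inspection_model.engine
```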
Despite the limitations, initial test flights conducted in the university’s drone hall have been encouraging. The drone successfully identified and circled insulators on power poles from a distance of three metres, collecting repeatable, high-quality image data.
Smart Sensor Fusion for Stable Flight
Flight stability is achieved through a combination of sensors, including an Inertial Measurement Unit, a camera, LiDAR and, optionally, GNSS. Data from these sensors is fused by an Extended Kalman Filter, which predicts and refines the drone’s position estimate up to 200 times per second. This allows it to make real-time adjustments, even in areas where GNSS signals are degraded or blocked.
“This approach avoids typical GPS issues such as signal loss caused by tall structures or narrow valleys,” Jantos notes. “Instead, we obtain stable positioning relative to the object being inspected.”
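The filter design itself is not detailed in the article, but the predict/update cycle at the heart of any Kalman-style estimator is compact enough to sketch. The Python below uses a simplified linear constant-velocity model, with a single 3-D position measurement standing in for the full IMU/camera/LiDAR/GNSS suite; under this linear model the EKF reduces to the plain Kalman filter equations.

```python
import numpy as np

DT = 1.0 / 200.0  # the filter runs up to 200 times per second

class PositionFilter:
    """Simplified Kalman filter: state [x, y, z, vx, vy, vz]."""

    def __init__(self):
        # Constant-velocity motion model stands in for the real
        # IMU-driven propagation (an assumption made for brevity).
        self.F = np.eye(6)
        self.F[:3, 3:] = DT * np.eye(3)
        self.Q = 1e-3 * np.eye(6)                          # process noise (placeholder)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.R = 1e-2 * np.eye(3)                          # measurement noise (placeholder)
        self.x = np.zeros(6)                               # state estimate
        self.P = np.eye(6)                                 # state covariance

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        """Fuse a 3-D position fix, e.g. an object-relative vision estimate."""
        y = z - self.H @ self.x                        # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

# Predict at the filter rate; update whenever a (slower) sensor reports.
ekf = PositionFilter()
ekf.predict()
ekf.update(np.array([1.0, 2.0, 0.5]))
print(ekf.x[:3])
```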
Seamless Integration of Systems
The camera connects to the drone’s onboard computer via USB3 and is integrated into the system using the IDS peak SDK. This software makes it easy to acquire images and adjust settings such as exposure and white balance, ensuring consistent image quality in different lighting conditions. For mission planning and execution, the researchers use the open-source CNS Flight Stack, which manages navigation, safety monitoring and data logging.
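As an illustration of that integration, the sketch below follows the pattern of IDS’s published Python samples for the IDS peak SDK: open the first camera, then set GenICam feature nodes for exposure and white balance. Exact node and class names can vary between SDK versions and camera models, so treat this as an assumption-laden sketch rather than the project’s actual code.

```python
from ids_peak import ids_peak

# Open the first attached camera through the IDS peak interface.
ids_peak.Library.Initialize()
manager = ids_peak.DeviceManager.Instance()
manager.Update()
device = manager.Devices()[0].OpenDevice(ids_peak.DeviceAccessType_Control)

# Camera features are exposed as GenICam nodes on the remote device.
nodemap = device.RemoteDevice().NodeMaps()[0]

# Fix exposure and let white balance track the lighting so that images
# remain comparable across flights (values are placeholders).
nodemap.FindNode("ExposureAuto").SetCurrentEntry("Off")
nodemap.FindNode("ExposureTime").SetValue(5000.0)  # microseconds
nodemap.FindNode("BalanceWhiteAuto").SetCurrentEntry("Continuous")

ids_peak.Library.Close()
```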
Looking to the Future
With systems like this, the future of infrastructure inspection is set to become safer, smarter and significantly more efficient. The combination of AI-driven navigation, compact sensor technology and high-resolution imaging is pushing the boundaries of what mobile robotics can achieve.
“In mobile robotics research, industrial cameras need to be robust, lightweight, fast and capable of processing data onboard,” Jantos emphasises. “The IDS camera provides all of that and more, making it a crucial part of our AI-based inspection system.”
As development continues, the long-term goal is clear. These drones are not just designed to fly autonomously; they are being trained to think, adapt and deliver precise data that improves the safety and reliability of the world’s critical infrastructure.
To find out more, visit IDS Imaging Development Systems.