Drone Computer Vision Precision Landing
Building a Vision-Based Precision Landing System for Drones Using ROS2 and YOLOv8
Autonomous precision landing is one of the most challenging problems in aerial robotics. It requires reliable perception, real-time control, and robust handling of imperfect sensors.
In this project, I built and tested a ROS2-based vision-guided landing system using:
- YOLOv8 for landing pad detection
- A low-cost USB camera
- A finite-state machine controller
- Real-time velocity command generation
This article explains the system architecture, design decisions, challenges, and lessons learned.
System Overview
The goal is simple:
Detect a landing pad in real time and guide a drone to align and descend safely onto it.
The system runs as a ROS2 node and follows this pipeline:
Camera → YOLO Detection → Pixel Error → Velocity Controller → State Machine
Key components:
| Component | Technology |
|---|---|
| Middleware | ROS2 (Humble) |
| Vision | YOLOv8 (Ultralytics) |
| Camera | USB webcam |
| Controller | Proportional (pixel-based) |
| Logic | Finite State Machine |
Landing State Machine
To ensure safe behavior, the system uses a finite-state machine:
- SEARCH – Hold position until a pad is detected
- ALIGN – Center the pad in the camera image
- DESCEND – Begin vertical descent while correcting alignment
- FINAL – Slow descent for accuracy
- TOUCHDOWN – Stop motion
This prevents unsafe commands when detection is lost or unstable.
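The five states above can be sketched as a small transition function. This is a hypothetical illustration of the logic, not the project's actual code: the threshold names and values (`align_tol_px`, `final_alt_m`, `touchdown_alt_m`) are assumptions.

```python
from enum import Enum, auto

class LandingState(Enum):
    SEARCH = auto()
    ALIGN = auto()
    DESCEND = auto()
    FINAL = auto()
    TOUCHDOWN = auto()

def next_state(state, pad_visible, err_px, alt_m,
               align_tol_px=25, final_alt_m=0.5, touchdown_alt_m=0.1):
    """Illustrative transition logic; thresholds are assumed values."""
    if state is LandingState.TOUCHDOWN:
        return state                        # terminal: motion stopped
    if not pad_visible:
        return LandingState.SEARCH          # lost the pad: hold and search
    if state is LandingState.SEARCH:
        return LandingState.ALIGN           # pad acquired
    if state is LandingState.ALIGN:
        return LandingState.DESCEND if err_px < align_tol_px else state
    if state is LandingState.DESCEND:
        return LandingState.FINAL if alt_m < final_alt_m else state
    if state is LandingState.FINAL:
        return LandingState.TOUCHDOWN if alt_m < touchdown_alt_m else state
    return state
```

Because every state falls back to SEARCH the moment the pad disappears, a dropout can never leave the controller issuing stale descent commands.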
Vision System (YOLOv8)
The perception layer uses a custom-trained YOLOv8 model to detect the landing pad.
Key features:
- Adjustable confidence threshold
- High-resolution inference option for long-range detection
- Detection smoothing using time-based memory
- Works with low-cost cameras
Example detection logic:
```python
det = self.detector.detect_best(frame)   # highest-confidence pad detection
dx = det.cx - image_center_x             # horizontal pixel offset
dy = det.cy - image_center_y             # vertical pixel offset
```
These pixel offsets drive the controller.
Velocity Controller
The controller converts pixel error into body-frame velocity:
```python
vx = kp * dx
vy = kp * dy
```
With safety clamping:
```python
vx = max(-max_vel, min(max_vel, vx))
vy = max(-max_vel, min(max_vel, vy))
```
This ensures smooth convergence without aggressive oscillations.
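The proportional law and the clamp can be folded into one helper. A minimal sketch, assuming the "near" gains from the tuning table later in the article (`kp = 0.0025`, `max_vel = 0.5`); the function name is mine, not the project's:

```python
def velocity_command(dx, dy, kp=0.0025, max_vel=0.5):
    """Map pixel error (dx, dy) to clamped body-frame velocities (m/s)."""
    clamp = lambda v: max(-max_vel, min(max_vel, v))  # symmetric saturation
    return clamp(kp * dx), clamp(kp * dy)
```

With these gains, a 32 px horizontal error maps to 0.08 m/s, while any error beyond 200 px saturates at the 0.5 m/s limit rather than commanding an aggressive correction.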
Typical outputs during alignment:
```text
vx = -0.08 m/s
vy =  0.12 m/s
```
Hardware Limitations (Important Finding)
One major discovery during testing:
Camera quality dominates detection distance.
Using a low-cost USB webcam:
| Landing Pad Size | Reliable Detection Range |
|---|---|
| 20 cm | ~1 m |
| 40 cm | ~2–3 m |
| 60 cm | ~3–4 m |
Even with advanced software, reliable 10 m detection is not realistic with a low-resolution sensor and wide-angle optics.
To reach 10 m:
- 1080p+ camera
- 6–8 mm lens
- 60–100 cm landing pad
- High-resolution YOLO inference
Software cannot overcome optical physics.
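The claim can be made concrete with a back-of-envelope pixel-footprint estimate. The parameters below (640 px sensor width, ~60° horizontal FOV) are assumed values typical of a cheap webcam, not measurements from this project:

```python
import math

def pad_width_px(pad_m, dist_m, img_w_px=640, hfov_deg=60):
    """Approximate on-sensor width of the landing pad, in pixels."""
    # Pinhole model: focal length in pixels from sensor width and FOV.
    focal_px = img_w_px / (2 * math.tan(math.radians(hfov_deg) / 2))
    return focal_px * pad_m / dist_m

# A 40 cm pad at 3 m spans roughly 74 px -- comfortably detectable.
# The same pad at 10 m spans roughly 22 px -- marginal for YOLO on a noisy feed.
```

No amount of model tuning changes that second number; only a narrower lens, a larger pad, or more sensor pixels do, which is exactly the hardware list above.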
Improving Long-Range Detection
Several optimizations were implemented:
1. Lower confidence threshold
```yaml
conf_th: 0.18
```
2. High-resolution YOLO inference
```python
results = model(frame, imgsz=1280)
```
3. Far/Near control tuning
| Mode | kp | max velocity |
|---|---|---|
| Far | 0.0012 | 0.4 m/s |
| Near | 0.0025 | 0.5 m/s |
4. Increased detection memory
```text
seen_recently < 1.0 sec
```
This prevents state flapping due to brief detection loss.
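The far/near tuning and the detection memory can be sketched together. This is an illustrative combination, not the project's code: the 2 m mode-switch altitude and all names here are assumptions, while the gain values come from the table above.

```python
import time

FAR_GAINS = {"kp": 0.0012, "max_vel": 0.4}   # gentle corrections far out
NEAR_GAINS = {"kp": 0.0025, "max_vel": 0.5}  # tighter tracking close in

def gains_for(alt_m, switch_alt_m=2.0):
    """Pick far or near gains by altitude (2 m switch point is assumed)."""
    return NEAR_GAINS if alt_m < switch_alt_m else FAR_GAINS

class DetectionMemory:
    """Remember a detection briefly so one dropped frame cannot flap the FSM."""
    def __init__(self, memory_s=1.0):
        self.memory_s = memory_s
        self.last_seen = None

    def update(self, detected, now=None):
        now = time.monotonic() if now is None else now
        if detected:
            self.last_seen = now
        # Pad counts as "seen" if a detection arrived within the window.
        return self.last_seen is not None and (now - self.last_seen) < self.memory_s
```

A single missed frame keeps the state machine in ALIGN or DESCEND; only a full second without detections triggers the fall back to SEARCH.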
Example Runtime Output
During operation:
```text
Target found -> ALIGN
[SIM CMD] vx=-0.08 vy=0.11 vz=0.00 dx=-5 dy=102
Aligned -> DESCEND
```
This confirms:
- Stable detection
- Correct sign mapping
- Smooth convergence
Results
✔ Real-time detection
✔ Stable alignment
✔ Robust loss recovery
✔ Works on low-cost hardware
✔ Easily extendable to PX4 offboard control
The system is suitable for:
- Indoor drone research
- Autonomous charging pads
- Warehouse UAV navigation
- Academic robotics projects
Next Steps
Planned upgrades:
- PX4 offboard velocity integration
- Gazebo + SITL simulation
- Kalman filtering for detection smoothing
- ArUco marker fallback
- Depth sensor fusion
- Two-stage FAR/NEAR landing strategy
Final Thoughts
This project demonstrates that:
Reliable precision landing is achievable with open-source tools, even on low-cost hardware — if the system is designed correctly.
ROS2 + YOLOv8 provides a powerful foundation for real-world autonomous aerial robotics.