Running Airside on the Raspberry Pi 5 - Comp 2025
- 1 Current branch:
- 2 Hardware Connections
- 3 Ground Testing
- 3.1 Setup repository
- 3.1.1 Clone Repo
- 3.1.2 Setting the right configs in config.yaml
- 3.1.2.1 video_input
- 3.1.2.1.1 OpenCV camera (regular RGB USB camera) camera_enum: 0
- 3.1.2.1.2 Picamera2 camera_enum: 1
- 3.1.2.2 detect_target
- 3.1.2.2.1 YOLO ML model option: 0
- 3.1.2.2.2 Bright spot detection option: 1
- 3.1.2.3 flight_interface
- 3.1.2.4 data_merge
- 3.1.2.5 geolocation
- 3.1.2.6 cluster_estimation
- 3.1.2.7 communications
- 3.2 Run integration tests
- 3.2.1 Test connection to Pixhawk
- 3.2.2 Worker + camera tests
- 3.3 Run Airside
- 3.3.1 Bad signs
- 3.3.2 Good signs
- 4 Real Flight Tests:
- 5 Ground side program
Current branch: houston-testing (https://github.com/UWARG/computer-vision-python/tree/houston-testing)
Hardware Connections
#TODO: Add pictures and better explanations for these connections.
The RPi is connected to the RPi Interface Rev B Comp 2025 Variant (see the End User Manual, linked) by mating all of the header pins together. It should already be connected.
The RPi is powered by the 3S LiPo battery (the thin/small one). Plug the battery into the drone's main power plug; a second lead runs from the main plug to the RPi Interface.
Connect the UART cable from the RPi Interface to the TELEM2 or TELEM3 port (depending on how that specific drone/Pixhawk is set up). On Houston, it is TELEM2.
Connect the killswitch line from the RPi Interface to the Pixhawk's side GPIO pins (this is the 3-pin connector, J10). There are 8 GPIO ports on the Pixhawk, so the correct one depends on the drone and how those are set up. On Houston, plug it into port 8.
Ground Testing
Setup repository
We will clone the Airside System Repository onto the RPi.
These steps will clone the latest main branch of the remote's Airside repository and the latest main branch of the remote's common repository. Advanced tuning is at your own discretion.
Clone Repo
You should know how to clone the repo, see Airside System Repository or Autonomy Workflow Software if you’re confused.
cd ~
git clone https://github.com/UWARG/computer-vision-python.git
cd computer-vision-python
python -m venv --system-site-packages ./venv
source ./setup_project.sh
Setting the right configs in config.yaml
The rest of this guide assumes you're in the computer-vision-python folder with the virtual environment activated, unless stated otherwise. Also, this section may be outdated by the time you read this; follow what the lead says rather than these suggestions.
Example config file: https://github.com/UWARG/computer-vision-python/blob/houston-testing/config.yaml (As of Mar 20, 2025, this one's detect_target section is slightly outdated. Please see below for correct settings.)
video_input
You may need to change this section on the fly during the flight test, so make sure you know how to edit files using the Linux command line!
OpenCV camera (regular RGB USB camera) camera_enum: 0
Set width: 1920
Set height: 1200
Set camera_config.device_index: 0
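Putting the above together, this part of config.yaml might look like the sketch below (the key nesting is an assumption based on this page, not copied from the repo; check the example config file linked above):
video_input:
  camera_enum: 0 # OpenCV USB camera
  width: 1920
  height: 1200
  camera_config:
    device_index: 0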
Picamera2 camera_enum: 1
See appendix C of the Picamera2 Library documentation and common's camera module to know what configs are available in camera_config, and what the values should be. A lead should usually tell you ahead of time what settings we want to test during the flight test.
Set width: 1920
Set height: 1200
Important camera settings for camera_config:
exposure_time: 10 (in us. Min: 36, Max: ?, Default: ~1000-2000?)
analogue_gain: 64.0 (Min: 0, Max: 64, Default: 1)
contrast: 1.0 (Min: 0, Max: 32, Default: 1)
lens_position: null (focal length, in 1/m. 0 for infinity, null for auto-focus. Default: null)
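With the Picamera2 option, the same section might look like this sketch (the nesting is again an assumption; the values shown are the ones suggested above):
video_input:
  camera_enum: 1 # Picamera2
  width: 1920
  height: 1200
  camera_config:
    exposure_time: 10 # microseconds
    analogue_gain: 64.0
    contrast: 1.0
    lens_position: null # null = auto-focus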
detect_target
YOLO ML model option: 0
Download the latest model from Landing Pad Models to tests/model_example/ or anywhere you like.
Change config.model_path to the new model's path (relative to the repository's root).
Bright spot detection option: 1
See Airside System Repository for more information on how to tune the brightspot detection.
Defaults are in the code comments and in the Airside System Repository.
As of Mar 20, 2025: The defaults are as follows:
brightspot_percentile_threshold: 99.99
filter_by_color: True
blob_color: 255
filter_by_circularity: False
min_circularity: 0.01
max_circularity: 1
filter_by_inertia: True
min_inertia_ratio: 0.1
max_inertia_ratio: 1
filter_by_convexity: False
min_convexity: 0.01
max_convexity: 1
filter_by_area: True
min_area_pixels: 160
max_area_pixels: 2000
min_brightness_threshold: 50
min_average_brightness_threshold: 130
flight_interface
Change address to /dev/ttyAMA0 for the serial port (this is how the RPi is connected to the Pixhawk).
baud_rate should be 57600.
Increase timeout if you see a lot of worker died ... messages in the logs during ground tests (60.0 or 120.0 is usually more than enough).
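As a sketch (key names taken from this page; the exact layout may differ in the repo's config.yaml):
flight_interface:
  address: /dev/ttyAMA0
  baud_rate: 57600
  timeout: 60.0 # increase if workers keep dying during ground tests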
data_merge
Increase timeout if you see a lot of worker died ... messages in the logs during ground tests (60.0 or 120.0 is usually more than enough).
geolocation
Make sure resolution_x and resolution_y are the same as width and height from video_input.
Set fov_x and fov_y according to the installed camera's field of view, in degrees.
Set camera_position_[x,y,z] to the camera's distance from the GPS on the drone (in meters). This is in NED, so positive x is the drone's front, positive y is the drone's right, and positive z is the drone's bottom.
Set camera_orientation_[yaw,pitch,roll] to the camera's yaw, pitch, and roll relative to the upright drone (pointing forwards), in radians. If it's pointing straight down, then it should be yaw = 0, pitch = -1.57079632679 (-pi/2), roll = 0.
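For example, the sketch below assumes a hypothetical camera mounted 10 cm in front of and 5 cm below the GPS, pointing straight down; the fov values are placeholders that must come from the installed camera's datasheet:
geolocation:
  resolution_x: 1920 # must match video_input width
  resolution_y: 1200 # must match video_input height
  fov_x: 60.0 # placeholder: installed camera's horizontal FOV (degrees)
  fov_y: 40.0 # placeholder: installed camera's vertical FOV (degrees)
  camera_position_x: 0.1 # 10 cm in front of the GPS (NED: +x is forward)
  camera_position_y: 0.0
  camera_position_z: 0.05 # 5 cm below the GPS (NED: +z is down)
  camera_orientation_yaw: 0.0
  camera_orientation_pitch: -1.57079632679 # -pi/2, pointing straight down
  camera_orientation_roll: 0.0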
cluster_estimation
Currently, min_activation_threshold must be greater than or equal to 10. For now, set it to 10.
Set min_new_points_to_run to 3 or 5, as long as it doesn't make the RPi too slow.
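As a sketch:
cluster_estimation:
  min_activation_threshold: 10 # must be >= 10
  min_new_points_to_run: 3 # 3 or 5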
communications
Increase timeout if you see a lot of worker died ... messages in the logs during ground tests (60.0 or 120.0 is usually more than enough).
Run integration tests
Assuming you’ve already tested this on your laptop using the Mission Planner simulator like a good developer, these steps should all go smoothly.
Test connection to Pixhawk
First, run common’s connection_test.py
to make sure you’ve set up the hardware properly (modify the CONNECTION_ADDRESS
to /dev/ttyAMA0
as instructed by the comment).
python -m modules.common.test_connection
The output should be CONNECTED, ... and not DISCONNECTED, ...
Worker + camera tests
Next, run all the integration tests and make sure they pass (i.e. they don't throw an error and you see Done! printed at the end).
Don’t actually type * in the command line, this just means you should run all of the integration tests in the tests/integration/
folder.
For test_flight_interface_hardware and test_flight_interface_worker, please change the MAVLINK_CONNECTION_ADDRESS to /dev/ttyAMA0. If it doesn't work, try doing it outside (yes, go touch grass). A GPS connection may be needed, and you can't get one inside the building.
As of Mar 20, 2025 the Detect Target integration/unit tests may be failing. Don’t worry, this is because we wanted to implement a better Detect target for Flight test, but didn’t have time to modify the test cases yet.
python -m tests.integration.*
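For example, to run the two flight interface tests mentioned above individually (assuming, as suggested above, that they live directly under tests/integration/):
python -m tests.integration.test_flight_interface_hardware
python -m tests.integration.test_flight_interface_worker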
Run Airside
Assuming you’ve already tested this on your laptop using the Mission Planner simulator like a good developer, these steps should all go smoothly. Please be patient, as it takes a while to start up and get some data (if you’ve run it on the simulator before, you would know. And the RPi is probably slower than your laptop).
Please test outside, as you need a GPS connection for Geolocation worker, and consequently Cluster estimation and Communications workers.
python -m main_2025 --cpu
Bad signs
Seeing a lot of worker died ... messages in the console and main.log.
There are a lot of [ERROR] ... messages in the worker logs (logs/[worker]_[pid].log).
Good signs
A new folder for the current time appears in the logs folder, and you can see all the pictures the camera took and the pictures with bounding boxes for detections (if you pointed the camera at either landing pad images on your phone or an IR beacon if using the IR camera).
There is a logs/[worker]_[pid].log file for all the workers, and they do not just have 1 line saying logger initialized. Note that you will need at least min_activation_threshold detections before cluster estimation even starts, so it may take a while. You can lower this in the config, but remember it must be at least 10.
Example: a good geolocation_worker_1234.log
This is an example of what a good log looks like after detect target has detected something and geolocation has run properly.
12:55:19: [INFO] [C:\Users\Ashish\Documents\Github\computer-vision-python\modules\geolocation\geolocation.py | run | 320] <class 'modules.detection_in_world.DetectionInWorld'>, vertices: [[1.8399680438179167, -3.017375576120407], [1.860959720882021, 1.938110034823547], [-1.636886286529644, -3.0054532965741436], [-1.6140751642718392, 1.9552601983682794]], centre: [ 0.11341 -0.53173], label: 0, confidence: 0.449462890625
Real Flight Tests:
We want the program to start automatically as soon as the drone is powered on, so we do it by using rc.local.
Setting camera focus
CV Camera (OpenCV option)
Google webcam tester, and make sure you can find the camera.
Set the focus by screwing the lens in/out, and focus on something really far away (just hold the drone and point it far).
PiCamera (Picam2 option)
If on auto-focus, then no need to do anything
If on manual focus, set the config to place the focus at 25m (lens_position is in units of 1/m! So a 25m focus distance corresponds to lens_position: 0.04.)
Setting live camera feed via VTX
Simply run these commands in the terminal:
export DISPLAY=:0.0
export XAUTHORITY=/home/warg/.Xauthority
This should not be required for screens connected via HDMI.
Setup hotspot
A hotspot is needed for SSH, which you need to code the RPi, see logs, and debug.
This must be set up on the ground (in the bay) many days before flight tests!
Switch on the Pi, connect it to a monitor, and connect a mouse and keyboard. See Raspberry Pi 5 for help.
Switch on your phone hotspot and connect to it on the RPi.
Note the IP address notification that pops up at the top right (just in case you need it later). Sometimes you can see it on your phone, but other times you cannot.
In the network settings of the RPi (top right), set your hotspot Wi-Fi to Priority 1 (or just any big number, high enough to prioritize it over the other networks) so it auto-connects to your hotspot on bootup.
Connect to your hotspot on the warg laptop (or your laptop, whichever one you plan on using during flight tests)
SSH
In PowerShell on the laptop, connect to the RPi:
ssh warg@raspberrypi.local
If it doesn't work, you can use the IP address that you noted down earlier instead of raspberrypi.local.
Alternatively, we now have a portable monitor! So you can just skip all of this hotspot and SSH stuff and directly bring a mouse, keyboard, and monitor! But do the SSH steps if you want to access the RPi faster, as you would need to plug in and unplug all of the stuff between each flight.
Setting up rc.local
Edit rc.local using:
sudo nano /etc/rc.local
Paste or type the following lines in (after you have already set up the airside repo in the above steps from the Ground Testing section):
deactivate # if a venv from a previous session is still active
cd ~/computer-vision-python
source ./venv/bin/activate
python -m main_2025 --cpu & # the trailing & runs it in the background
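For reference, a complete /etc/rc.local might look like the sketch below. The shebang and trailing exit 0 are standard for Debian-style rc.local files, and the absolute /home/warg path is an assumption (rc.local runs as root, so ~ may not resolve to warg's home):
#!/bin/sh -e
# Sketch: start the airside program on boot, assuming the repo is at /home/warg/computer-vision-python
cd /home/warg/computer-vision-python
. ./venv/bin/activate
python -m main_2025 --cpu &
exit 0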
Ensure that rc.local is enabled:
sudo systemctl enable rc-local.service
You may come across a message saying that you shouldn’t be enabling this service like this, but you can ignore it.
You should see "enabled" or "active" in green; you may have to scroll down or right using the arrow keys. If it is still running, it may say active or running somewhere. You can check the status (enabled, stopped, running, or disabled) using the following:
sudo systemctl status rc-local.service
To stop the script that was started by rc.local:
sudo systemctl stop rc-local.service
This should stop the program, but still leave rc-local.service enabled. You can check again to make sure if you want. So, the next time you power on the drone, it will automatically start the program again.
Ground side program
Now that the airside program (main_2025.py) is set up on the RPi, the ground-side program (recieve_statustext.py) should be started as well, on the laptop. The airside system sends MAVLink messages to the ground station, and we have a ground-side script to receive and parse these messages, generating a KML file. It's very simple to start if you're familiar with Mission Planner (help docs here: https://uwarg-docs.atlassian.net/wiki/x/AYCNhQ?atlOrigin=eyJpIjoiYTUyNThiMzRlYzRmNGIzNTg4OWZlMmE1NDA2YmIyYzciLCJwIjoiYyJ9 )
Details can be found here: https://uwarg-docs.atlassian.net/wiki/x/VQAsqw?atlOrigin=eyJpIjoiZGZkMmY5Yjk2Njc4NGIxMThmZjhkOWFlNTgzNjJhNzQiLCJwIjoiYyJ9 . TLDR is below:
Connect the drone/controller to Mission Planner. There should be a little green drone on the map, and the top left should say “Connected”
Enable MAVLink forwarding
Press ctrl + f → MAVLink → TCP Host 14550 + Enable Write Access
After that, you can close all these windows
Start the script
python -m modules.recieve_statustext
In the case that the groundside script does not run as intended, there's still another way to get the GPS coordinates from the drone. Extract the logs from the RPi and follow the instructions in Post Processing!