2023-2024 Project List
Goals & Milestones
Director Sync meeting notes with the Autonomy Team Leads: Autonomy 2023-2024 Goals & Milestones
Active Projects
Integrated Monitoring And Command Station (IMACS) 2.0
WARG is creating ground station software to monitor and control drones. The software is a user interface (UI) that allows pilots and ground station operators (GSOs) to quickly read important drone telemetry in real time, such as position, orientation, and the status of drone components (e.g. battery voltage). It also allows the operator to send autonomous commands.
IMACS 2.0 is a desktop application written in the Dart programming language with the Flutter framework that runs on the ground station computer.
2023-05-18 IMACs Revamp Meeting
Your tasks are to develop and integrate UI widgets into a comfortable user experience (UX) as requested by the pilots and GSOs.
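For illustration only, here is a minimal Python sketch (not part of the Dart/Flutter application itself) of reading the kind of telemetry the UI displays from a MAVLink stream, assuming pymavlink and a hypothetical UDP endpoint:

```python
# Minimal sketch (illustration only; IMACS 2.0 itself is written in Dart/Flutter).
# Assumes pymavlink and a hypothetical telemetry endpoint on udpin:0.0.0.0:14550.
from pymavlink import mavutil

# Connect to the telemetry stream forwarded to the ground station computer.
connection = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
connection.wait_heartbeat()

while True:
    msg = connection.recv_match(
        type=["GLOBAL_POSITION_INT", "ATTITUDE", "SYS_STATUS"], blocking=True
    )
    if msg.get_type() == "GLOBAL_POSITION_INT":
        # Position is reported in degrees * 1e7 and altitude in millimetres.
        print(f"lat={msg.lat / 1e7}, lon={msg.lon / 1e7}, alt={msg.alt / 1000} m")
    elif msg.get_type() == "ATTITUDE":
        print(f"roll={msg.roll}, pitch={msg.pitch}, yaw={msg.yaw} (rad)")
    elif msg.get_type() == "SYS_STATUS":
        # Battery voltage is reported in millivolts.
        print(f"battery={msg.voltage_battery / 1000} V")
```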
Airside System
Previously Airside System and Jetson.
The airside system is the hardware and software that controls the drone during Search and Landing. The hardware provides the platform the software runs on. The software detects possible landing pads, collects their locations, and makes a decision.
The airside system software is written in the Python programming language and is OS-independent; it runs on WARG’s NVIDIA Jetson (and can run on a Raspberry Pi, but very slowly). The software also includes drivers for interacting with the hardware attached to the computer.
The airside system hardware includes the computer on which the software runs, the global shutter camera for perception, and the communication link between the computer and the flight controller.
Your tasks are to develop the perception-decision-control loop that runs when the drone arrives at a waypoint. You integrate the system with the camera and the flight controller to receive data and take action by sending commands back to the flight controller.
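As an illustration only, here is a minimal Python sketch of the loop’s overall shape; the detection, camera, and flight controller interfaces are hypothetical stand-ins for the real airside system modules:

```python
# Minimal sketch of the perception-decision-control loop at a waypoint.
# The detector, camera, and flight-controller objects are hypothetical
# stand-ins; the real airside system splits these into separate modules.
from dataclasses import dataclass

@dataclass
class Detection:
    confidence: float
    latitude: float
    longitude: float

def choose_landing_pad(detections: list[Detection]) -> Detection | None:
    """Decision step: pick the most confident detection, if any is good enough."""
    confident = [d for d in detections if d.confidence > 0.7]
    return max(confident, key=lambda d: d.confidence, default=None)

def run_search_and_landing(camera, detector, flight_controller) -> None:
    """Perception -> decision -> control, repeated until a landing pad is chosen."""
    while True:
        image = camera.get_image()                        # Perception input
        detections = detector.detect_landing_pads(image)  # Possible landing pads
        target = choose_landing_pad(detections)           # Decision
        if target is not None:
            # Control: command the flight controller to move toward the pad.
            flight_controller.goto(target.latitude, target.longitude)
            return
```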
Pathing
Previously QR Scanning and Mission Planner Integration.
During Cruise, the drone travels between waypoints over a distance of several kilometres. The pathing software plans the drone’s route (i.e. the waypoint order) and changes it dynamically based on internal or external factors (e.g. battery low, time, diversion around an area).
Pathing runs on the ground station computer.
Your tasks are to develop the path planning algorithms to decide the waypoint order of travel. You integrate the system with the communication interface to receive drone telemetry and send commands.
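As an illustration only, here is a minimal Python sketch of one possible ordering heuristic (greedy nearest-neighbour over straight-line distance); the actual planning algorithms and diversion logic are chosen by the pathing team:

```python
# Minimal sketch: greedy nearest-neighbour waypoint ordering.
# This is an illustrative heuristic, not the team's chosen algorithm.
import math

def distance(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Straight-line distance; a real implementation would use geodesic distance."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def order_waypoints(start: tuple[float, float],
                    waypoints: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Visit the closest unvisited waypoint next, starting from `start`."""
    remaining = list(waypoints)
    route = []
    current = start
    while remaining:
        nearest = min(remaining, key=lambda w: distance(current, w))
        remaining.remove(nearest)
        route.append(nearest)
        current = nearest
    return route

# Example: order three waypoints relative to the drone's current position.
print(order_waypoints((0.0, 0.0), [(5.0, 5.0), (1.0, 1.0), (2.0, 3.0)]))
```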
ML Model
The ML model contains the weights required for running inference on camera images. The training images are collected during flight tests, cleaned (manually), labelled (manually), augmented, and organized into a dataset for training.
The majority of your time is spent collecting, cleaning, and labelling images.
Your tasks are to optimize the camera settings by repeatedly collecting images outside (e.g. off E5 balcony, flight tests) in all weather conditions, checking the images, and changing the settings. Once the settings are finalized, you collect images during flight tests to add to the dataset. You manually clean and label the dataset as new images are collected.
Once the dataset is considered large enough, you augment the dataset to produce a training, validation, and test set, which you use to train the ML model.
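As an illustration only, here is a minimal Python sketch of the split step, assuming the cleaned and labelled samples are already paired as (image, label) paths; the actual augmentation pipeline and split ratios are chosen by the ML team:

```python
# Minimal sketch: shuffle a labelled dataset and split it into train/val/test.
# The (image, label) pairing and the 80/10/10 ratios are assumptions.
import random

def split_dataset(samples: list[tuple[str, str]], seed: int = 0):
    """samples: (image_path, label_path) pairs. Returns (train, val, test)."""
    shuffled = list(samples)
    random.Random(seed).shuffle(shuffled)
    n = len(shuffled)
    n_train = int(0.8 * n)
    n_val = int(0.1 * n)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test
```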
LTE Communication
For the AEAC 2024 Student UAS Competition, the airside system runs on the ground station computer. An LTE link is used to communicate between the airside system software and the rest of the airside system hardware. A VPN creates a common network in which data can be transmitted and received between the ground station and the drone.
TODO: Documentation
Your tasks are to develop the communication of images from the drone to the ground. On the drone, you send images from the camera over the network; on the ground, you forward the images received from the network to the airside system software.
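As an illustration only, here is a minimal Python sketch of one way to frame images over the VPN link using a plain TCP socket with a length prefix; the actual transport, ports, and protocol used by the team may differ:

```python
# Minimal sketch: length-prefixed image frames over a TCP socket inside the VPN.
# The framing and transport are assumptions, not the team's actual protocol.
import socket
import struct

def send_image(sock: socket.socket, image_bytes: bytes) -> None:
    """Drone side: prefix the encoded image with its length, then send it."""
    sock.sendall(struct.pack("!I", len(image_bytes)) + image_bytes)

def recv_image(sock: socket.socket) -> bytes:
    """Ground side: read the length prefix, then the full image payload."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack("!I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, count: int) -> bytes:
    """Read exactly `count` bytes, looping until the socket delivers them all."""
    data = b""
    while len(data) < count:
        chunk = sock.recv(count - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        data += chunk
    return data
```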
Proposed projects
These projects are not finalized. DO NOT include them in project selection.
Project Assignment
W24
Projects | Members | Notes |
---|---|---|
Autonomy Advisor | | |
Autonomy Leads | | |
IMACS 2.0 | | Looking for people to spearhead this project |
Autonomy Airside | @Karthigan Uthayan @David Wu @Dylan Finlay @Ethan Ahn @Eshaan Mehta @nathan.martin @Balaji Leninrajan @Sanjay Seenivasan @Victor Terme @Aritra Kar @Nuzhat Rudba | 9 |
Pathing | @Ohm Patel @Arunav Munjal @Daniel Chenrui Zhang @Hard Shah @Mahan Sharifi-Ghazvini @Jane Zeng @Julia Zhu @Iris Mo | 8 |
ML Model | @Joseph Bagheri @Vibhinn Gautam @Kevin Wu @Yash Gunturi Eshwara Vidya @Harini Karthik | 5 |
LTE Communication | @Maxwell Lou @Jonathan Yuan @Tyler Chen @Cindy Li @Isabelle Huang @Jiwon Kim @Krish Patel | 6 |
F23
Projects | Members | Notes |
---|---|---|
Autonomy Advisor | | |
Autonomy Leads | | |
IMACS 2.0 | | 5 |
Autonomy Airside | | 6 |
Pathing | | 5 |
ML Model | | 4 |
LTE Communication | | 5 |
S23
Autonomous Landing
Now part of Airside System.
Research and develop an autonomous landing algorithm to help the drone land on a UAV landing pad. The team will receive data from ZeroPilot (WARG’s in-house flight controller) and bounding box information from the landing pad detection model, and will output a movement command to ZeroPilot. The auto-landing code will run on the Jetson and interface with ZeroPilot using MAVLink.
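As an illustration only, here is a minimal Python sketch of converting a landing pad bounding box into angular offsets from the image centre, the kind of value that would be turned into a movement command over MAVLink; the camera resolution and field of view are assumed values, not the real camera’s parameters:

```python
# Minimal sketch: convert a landing-pad bounding box into angular offsets from
# the image centre. The resolution and field of view below are assumptions.
import math

IMAGE_WIDTH, IMAGE_HEIGHT = 1920, 1080              # assumed resolution (pixels)
FOV_X, FOV_Y = math.radians(80), math.radians(52)   # assumed field of view

def bbox_to_angles(x_min: float, y_min: float,
                   x_max: float, y_max: float) -> tuple[float, float]:
    """Angular offset (rad) of the bounding-box centre from the image centre."""
    centre_x = (x_min + x_max) / 2
    centre_y = (y_min + y_max) / 2
    angle_x = (centre_x - IMAGE_WIDTH / 2) / IMAGE_WIDTH * FOV_X
    angle_y = (centre_y - IMAGE_HEIGHT / 2) / IMAGE_HEIGHT * FOV_Y
    return angle_x, angle_y

# Example: a pad detected right of and below centre yields positive offsets.
print(bbox_to_angles(1100, 500, 1300, 700))
```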
Autonomous Landing (Competition)
2023-24 Summer Auto-Landing Project
Projects | Members | Notes |
---|---|---|
Autonomy Leads | | |
IMACS 2.0 | | |
Airside System and Jetson | | |
QR Scanning and Mission Planner Integration | | |
Autonomous Landing | | |